HiddenLayer555@lemmy.ml to Technology@lemmy.ml • Iranian missile strikes tech park housing Microsoft office in southern Israel • 5 days ago

Have they tried power cycling it?
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • How an Unknown Chinese Phonemaker Took Over Africa • 10 days ago

> Unknown Chinese Phonemaker
Shit title that really highlights Bloomberg’s western-supremacist worldview. Unknown to whom? Clearly tons of people know about it if it took over Africa.
They have different brands than us because they’re a completely different region and market? Nope, never heard of it so it must be unknown.
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • Activists are Designing Mesh Networks to Deploy During Civil Unrest • 10 days ago

Mesh networks can be built on zero-trust principles with everything E2EE. Kind of like Tor.
But the more realistic scenario is the police will just deploy jammers to completely disable all wireless communication.
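The zero-trust idea can be made concrete with a toy sketch of onion-style layered encryption, the structure Tor uses: the sender wraps the message once per relay, and each relay can strip exactly one layer while learning nothing about the plaintext. The XOR-with-hash keystream below is NOT real cryptography (an actual mesh messenger would use vetted primitives, e.g. the Noise framework); it only illustrates the relaying structure, and all the key names are made up.

```python
import hashlib

# Toy illustration ONLY: a SHA-256-derived XOR keystream is not secure.
# It just demonstrates the layered, zero-trust relaying structure.

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from key via counter-mode hashing."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_layer(key: bytes, data: bytes) -> bytes:
    """Apply (or remove -- XOR is symmetric) one encryption layer."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

hop_keys = [b"relay-1", b"relay-2", b"relay-3"]  # hypothetical hop keys
msg = b"meet at the square"

# Sender wraps innermost layer first, so relay-1 peels the outermost.
wrapped = msg
for key in reversed(hop_keys):
    wrapped = xor_layer(key, wrapped)

# Each relay peels exactly one layer; only after the last hop is the
# message readable.
for key in hop_keys:
    wrapped = xor_layer(key, wrapped)
print(wrapped.decode())  # meet at the square
```

No single relay holds more than one key, which is what makes the relays untrusted infrastructure rather than trusted parties.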
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • AI can't even run a vending machine -- Vending-Bench: A Benchmark for Long-Term Coherence of Autonomous Agents • 17 days ago

My new baseless theory: we know that AI is trained on tons of novels and fictional stories. All novels have significant conflict and drama, and stories where someone boringly does their boring job forever aren’t exactly bestsellers. Is it possible the AI is trying to inject drama even when it makes no sense, because it’s been conditioned that way by its training data? It sees these inconsequential issues, and since every novel it’s ever “read” turns them into massive conflicts, it tries to follow suit.
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • AI can't even run a vending machine -- Vending-Bench: A Benchmark for Long-Term Coherence of Autonomous Agents • 17 days ago

In the same way your fridge needs a web browser.
Though the point of this probably isn’t a viable product: managing a vending machine is one of those seemingly easy, straightforward tasks that make a good starting point for testing the AI. Basically, if it can’t even handle something as simple as a vending machine, it definitely can’t be trusted with anything more complex.
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • AI can't even run a vending machine -- Vending-Bench: A Benchmark for Long-Term Coherence of Autonomous Agents • 17 days ago

“You call yourself a beverage machine?!”
“I call myself Bev.”
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • Interferometer Device Sees Text from a Mile Away • 20 days ago

Just say “pfft, you believe in the government?”
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • Former Chinese NVIDIA AI Engineers Are Now Working for Huawei, Reveals NVIDIA's Chief Scientist Bill Dally, Warning Chinese Competition Is Closing In • 23 days ago

> Warning Chinese Competition Is Closing In

You can tell how much everyone actually believes in capitalism when a new competitor in the free market is a dire warning and perceived as a threat.
Honestly, I’ve been doing some recreational thinking about this whole thing, and I find myself agreeing with you. You brought up good points I hadn’t considered, thanks!
> You can’t put toothpaste back in the tube. The only question going forward is how AI will be developed and who will control it.
Fair enough, but even if the model is open source, you still have no control over or knowledge of how it was developed or what biases it might have baked in. AI is by definition a black box, even to the people who made it; it can’t even be decompiled like a normal program.
> It’s funny that you’d bring up the drug analogy because you’re advocating a war on drugs here.
I mean, China has the death penalty for drug distribution, which is supported by the majority of Chinese citizens. They do seem more tolerant of drug users than the US (I’ve never done drugs in China or the US, so I wouldn’t know), so the zero tolerance for distributors is clearly a very intentional decision by the Communist Party. As far as I know, no socialist country has ever been tolerant of even the distribution of cannabis, let alone hard drugs, and they have made it pretty clear that they never will be.
> Personally, I have absolutely no problem with that if the model is itself open and publicly owned. I’m a communist, I don’t support copyrights and IP laws in principle. The ethical objection to AI training on copyrighted material holds superficial validity, but only within capitalism’s warped logic. Intellectual property laws exist to concentrate ownership and profit in the hands of corporations, not to protect individual artists.
I never thought of it in terms of copyright infringement, but in terms of reaping the labour of proletarians while giving them nothing in return. I’m admittedly far less experienced of a communist than you, but I see AI as the ultimate means of removing workers from their means of production because it’s scraping all of humanity’s intellectual labour without consent, to create a product that is inferior to humans in every way except for how much you have to pay it, and it’s only getting the hype it’s getting because the bourgeoisie see it as a replacement for the very humans it exploited.
For the record, I give absolutely no shits about pirating movies or “stealing” content from any of the big companies, but I personally hold the hobby work of a single person in higher regard. It’s especially unfair to the smallest content creators, because they are most likely making literally nothing from their work; the vast majority of personal projects are uploaded for free on the public internet. It’s therefore unjust (at least to me) to appropriate their free work into something whose literal purpose is to get companies out of paying people for content.

Imagine working your whole life on open source projects, only for no company to want to hire you because they’re using AI trained on your open source work to do what they would have paid you to do. Imagine writing novels your whole life and putting them online for free, only for no publisher to want to pay for your work because they have a million AI monkeys trained on your writing, typing out random slop and essentially brute-forcing a bestseller. Open source models won’t prevent this from happening; in fact, they will only make it easier.
AI sounds great in an already communist society, but in a capitalist one, it seems to me like it would be deadly to the working class, because capitalists have made it clear that they intend to use it to eliminate human workers.
Again, I don’t know nearly as much about communism as you so most of this is probably wrong, but I am expressing my opinions as is because I want you to examine them and call me out where I’m wrong.
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • My AI Skeptic Friends Are All Nuts • 25 days ago

> [Linked article] M3 Ultra Runs DeepSeek R1 With 671 Billion Parameters Using 448GB Of Unified Memory, Delivering High Bandwidth Performance At Under 200W Power Consumption, With No Need For A Multi-GPU Setup
Running the AI is not where the power demand comes from; training it is. If you trained it only once, it wouldn’t be so bad, but obviously every AI vendor trains constantly to keep their model competitive. That’s where you get a tragedy-of-the-commons situation: collective power consumption spirals out of control for tiny improvements to the model.
> Meanwhile, corps clearly don’t care about IP here and will keep developing this tech regardless of how ethical it is.
“It will happen anyway” is not an excuse to not try to stop it. That’s like saying drug dealers will sell drugs regardless of how ethical it is so there’s no point in trying to criminalize drug distribution.
> Seems to me that it’s better if there are open models available and developed by the community than there only being closed models developed by corps who decide how they work and who can use them.
Except there are no truly open AI models, because they all use stolen training data. Even the “open source” models like Mistral and DeepSeek say nothing about where they get their data. The only way to have a truly open source AI model would be a reputable pool of training data where all the original authors consented to their work being used to train AI.
Even if the model itself is open source and free to run, if there are no restrictions against using the generated data commercially, it’s still complicit in the theft of human-made works.
A lot of people will probably disagree with me but I don’t think there’s anything inherently wrong with using AI generated content as long as it’s not for commercial purposes. But if it is, you’re by definition making money off content that you didn’t create which to me is what makes it unethical. You could have hired that hypothetical person whose work was used in the AI, but instead you used their work to generate value for yourself while giving them nothing in return.
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • My AI Skeptic Friends Are All Nuts • 25 days ago

The stolen training data issue alone is enough to make the use of AI in business settings unethical. And until there’s an LLM trained on 100% authorized data, selling a product developed with AI is outright theft.
Of course there’s also the energy use issue. Yeah, congrats, you used as much energy as a plane ride to generate something you could have written with your own brain with a fraction of the energy.
HiddenLayer555@lemmy.ml (OP) to Technology@lemmy.ml • Remember when flip phones came with a second battery that you can swap in when the first one dies? • 1 month ago

I wish I could still use a Fairphone as a daily driver in Canada. I have a Fairphone 4; I had it shipped all the way from Europe and used it for two years before my network suddenly stopped connecting to it, so I ended up getting a new phone. To be fair, it’s very much unsupported in Canada and only has European bands, and I had to use a shipment-forwarding service to get it here. I still use it at home on Wi-Fi, though.
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • Microsoft Bans the Word “Palestine” in Internal Emails • 1 month ago

ραℓєѕтιηє
HiddenLayer555@lemmy.ml to Technology@lemmy.ml • China's 40-story gravity batteries threaten lithium's energy reign • 2 months ago

Couldn’t you get more energy density with compressed air? That way the entire volume of your warehouse is storing energy at the same time.
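As a back-of-envelope check on that intuition, here’s a sketch comparing volumetric energy density of a raised concrete mass against ideal isothermal compressed-air storage. The numbers (120 m lift for a ~40-story building, 2400 kg/m³ concrete, 300 bar tank pressure) are assumed figures, and real compressed-air systems lose a large fraction of this to heat, so treat it as an upper bound:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gravity_energy_density(height_m, rho_kg_m3=2400.0):
    """Potential energy (J) stored per m^3 of mass raised by height_m.
    Default density is roughly that of concrete."""
    return rho_kg_m3 * G * height_m

def isothermal_air_energy_density(p_bar, p0_bar=1.0):
    """Ideal isothermal work (J) recoverable per m^3 of tank volume
    held at p_bar, expanding back down to ambient p0_bar: W = p*V*ln(p/p0)."""
    p, p0 = p_bar * 1e5, p0_bar * 1e5  # bar -> Pa
    return p * math.log(p / p0)

grav = gravity_energy_density(120)        # ~40-story lift
air = isothermal_air_energy_density(300)  # 300 bar storage tank

print(f"gravity: {grav / 3.6e6:.2f} kWh/m^3")  # ~0.78
print(f"air:     {air / 3.6e6:.1f} kWh/m^3")   # ~47.5
```

Even with optimistic assumptions for the gravity battery, ideal high-pressure air stores orders of magnitude more energy per unit volume, which is the commenter’s point; the trade-off in practice is round-trip efficiency and the cost of pressure vessels.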
Learning is woke. True patriots stick to their misconceptions for life.