Replacement.ai

(replacement.ai)

737 points | by wh313 5 hours ago

68 comments

  • TechSquidTV 4 hours ago
    Personal belief, but robots coming for your jobs is not a valid argument against robots. If robots can do a job better and/or faster, they should be the ones doing the jobs. Specialization is how we got to the future.

    So the problem isn't robots, it's the structure of how we humans rely on jobs for income. I don't necessarily feel like it's the AI company's problem to fix either.

    This is what government is for, and not to stifle innovation by banning AI but by preparing society to move forward.

    • everdrive 3 hours ago
      Personal belief, but robots coming to have sex with your wife is not a valid argument against robots. If robots can do your wife better and/or faster, they should be the ones doing the job. Specialization is how we get to the future.

      So the problem isn't robots, it's the structure of how your wife relies on you for lovemaking. I don't feel like it's necessarily the AI company's problem to fix either.

      This is what government is for, and not to stifle innovation by banning hot robot sex with your wife, but preparing your family for robot/wife lovemaking.

      • raincole 1 hour ago
        It's actually an extremely good analogy (but in the opposite way of what you imply), as you don't own your job or your wife. Banning AI for job security is like banning dildos because they make men feel insecure.
        • klipt 57 minutes ago
          It's not limited to men. A lot of feminists are also vocally against the idea of female sex bots because it makes them feel insecure.
          • Ghexor 6 minutes ago
            I see female sex robots as a symptom, a manifestation of the male gaze. I've heard personal histories of women being courted, only to find the guy was out for sexual gratification and didn't much care for her perspective on life. A justified anger there, I think, at a culture that perpetuates and celebrates this form of relating. Insecurity, I suspect, befalls more prominently the indoctrinated women who are catering to male expectations of beauty and ease. The successful feminists don't care if you're only screwing sexbots. If you are, it'll be great to have you off the market anyway. A filter for the men who won’t meaningfully connect.
          • lazyfanatic42 18 minutes ago
            "a lot of feminists", source?
          • anonym29 44 minutes ago
            Any idea why a bipedal fleshlight makes them insecure? My head jumps to "could it be because they fear that without sex, they have no value?" but that sounds so ridiculous to me. All human life matters, has value, and should be treated with kindness, respect, and dignity. I've met plenty of women who add value to the world in a bajillion other ways.

            Can someone help me understand this one?

            • raincole 28 minutes ago
              > All human life matters, has value, and should be treated with kindness, respect, and dignity

              This is reasoning.

              However people don't always act based on reasoning.

              And even if you act based on reasoning, you can't trust others to act based on the same kind of reasoning.

              ---

              For a more serious response: (some) feminists see sex robots as objectification of women, and that's why they're against them.

            • bofadeez 38 minutes ago
              This man has met a woman who added value (slow clap)
        • atleastoptimal 1 hour ago
          There's a difference in scale and potential consequences though

          What if there were some robot with superhuman persuasion and means of manipulating human urges such that, if it wanted to, it could entrap anyone it wanted into complete subservience? Should we happily acquiesce to the emerging cult-like influence these new entities have on susceptible humans? What if your parents messaged you one day, homeless on the street, because they gave all their money and assets to a scammer robot that, via a 300 IQ understanding of human psychology, manipulated them into sending all their money in a series of wire transfers?

          • lucas_v 1 hour ago
            > superhuman persuasion and means of manipulating human urges such that, if it wanted to, it could entrap anyone it wanted into complete subservience

            Wow, this potential/theoretical danger sounds a lot like the already existing attention economy; we're already manipulated at scale by "superhuman" algorithms designed to hijack our "human urges," create "cult-like" echo chambers, and sell us things 24/7. That future is already here.

            • atleastoptimal 52 minutes ago
              Which is true. People are scammed by very low-quality schemes (like that French woman who sent 1 million to scammers claiming to be Brad Pitt, who for some reason needed money for medical procedures).

              Humans generally have a natural wariness, a mental immune-system response, to these things, however, and for 99% of people the attention economy is far from being able to completely upend their life or get them to send all their money in an agreement made in bad faith by the other party. I don't see why, if some AI were to possess persuasion powers to a superhuman degree, it wouldn't be able to cause 100x the damage when directed at the right marks.

      • warent 3 hours ago
        Equating “business” to “profound human intimacy” might be one of the most HackerNews comments of all time
        • igor47 2 hours ago
          Jobs aren't just business. Humans derive a lot of meaning from being useful and valued.
          • warent 2 hours ago
            You’re right, and you're validating the point.

            A specific part of GP’s comment keeps getting overlooked:

              So the problem isn't robots, it's the structure of how we humans rely on jobs for income.
            
            Humans being forced to trade time for survival, money, and the enrichment of the elite, is a bug. We are socially conditioned to believe it’s a feature and the purpose.

            Nobody is saying robots should replace human connection and expression

            Edit: tone

            • mlyle 1 hour ago
              > > Humans derive a lot of meaning from being useful and valued.

              Sure, humans relying on jobs for income is a problem during transitions. But people finding purpose in jobs is a problem, too.

              Right now how we get there is being "forced to" -- and indeed that's a bug. But if we transition to a future where it's pretty hard to find useful work, that's a problem, even if the basic needs for survival are still being met.

              I haven't had to work for 25 years. But I've spent the vast majority of that time employed. Times when I've not had purposeful employment as an anchor in my life have been rough for me. (First 2-3 months feels great... then it stops feeling so great).

              • warent 1 hour ago
                Thanks for sharing. Absolutely right; people need to feel useful and valued—not to mention, jobs can help us get out of the house and connect with people.

                Just to be clear, are you saying the only life work that you can find fulfillment in is work that can be perfectly automated and handled by AI? Do you have an example of what you mean?

            • beeflet 2 hours ago
              It is not a bug, it is a law of nature. The world has limited resources, time and labor among them.

              The technology offers the elites a source of labor so abundant that they will not need to trade their wealth with the eaters.

              However few resources you consume, they will be more than your labor can buy. You will be priced out of existence.

              • warent 2 hours ago
                Elites unilaterally claiming and reaping the benefits of automation (i.e. consolidation of wealth) is not a law of nature.
                • beeflet 1 hour ago
                  It is. It's not something you can wave away with some new political system.

                  Automation results in centralization of power. It transforms labor-intensive work to capital-intensive work and reduces the leverage of the working class.

                  You could have a system that distributes wealth from automation to the newly-unemployed working class, but fundamentally the capital-owners are less dependent on the working class, so the working class will have no leverage to sustain the wealth distribution (you cannot strike if you don't have a job). You are proposing a fundamentally unstable political system.

                  It's like Liebig's law of the minimum or any other natural law. You can try to make localized exceptions in politics, but you are futilely working against the underlying dynamics of the system, which are inevitably realized in the long term.

                  • SkyeCA 50 minutes ago
                    > so the working class will have no leverage to sustain the wealth distribution

                    As has been seen time and time again throughout history, the commoners will only put up with so much; when all else fails and they start suffering a bit too much, leverage comes from the end of a barrel.

                  • visarga 43 minutes ago
                    I think you misread the situation. The move is towards open models and small, efficient models. What makes you believe there will be a moat around AI for automation?
                • mattgreenrocks 46 minutes ago
                  Correct. If we don’t do anything, it effectively is about as immutable as a law of nature. But if enough people respond, the system will change in some way.

                  Note that the stench of inevitability likes to sneak into these discussions of systemic problems. Nothing is set in stone. Anyone telling you otherwise has given up themselves. The comment section attracts all kinds of life outlooks, after all. The utility of belief in some sort of agency (however small) shouldn’t be surrendered to someone else’s nihilistic disengagement.

                • jimnotgym 1 hour ago
                  Yet the elite won't share that benefit until someone makes them. History makes me think that won't happen until hunger motivates the masses from their apathy.
            • EchoReflection 1 hour ago
              it's only for "the enrichment of the elite" if one looks at life with a perspective of entitlement, resentment and disregard for the nature of...nature (existence,randomness, realty itself).

              just because humans can't "outdo" technology doesn't mean we should "blame" "the elite". That's literally how the great catastrophes of socialism, communism, Marxism, etc. started.

              Humans aren't "forced" to do anything, (depending on how you look at it). You could just lay down, "live" in your own excrement until you starve to death. That seems reasonable! Liberate the proletariat! Why doesn't everyone else work for me?!

              https://www.merriam-webster.com/dictionary/ressentiment

          • eli_gottlieb 1 hour ago
            For most people, most of the time, jobs are a lot closer to "just business" than "marital intimacy".
        • melagonster 1 hour ago
          People cannot find new jobs that quickly. They will starve and probably die. This is more about the right to live.
        • beeflet 2 hours ago
          I think you overstate the profundity. And like business it is a market and much of the same rules apply.

          If you disagree, feel free to argue your point instead of just scoffing at the idea.

        • SkyeCA 2 hours ago
          My job is more important than intimacy. Intimacy won't keep me warm, or fed.
          • dmd 2 hours ago
            If intimacy doesn’t keep you warm you might be doing something wrong.
            • SkyeCA 53 minutes ago
              It's far less effective than the hydrocarbons that heat my house. My point being that I think people on HN like to underestimate how important work is when talking about replacing humans.

              It's not "just business", it's my ability to survive.

            • rkomorn 2 hours ago
              It's the ice cubes.
      • DiggyJohnson 14 minutes ago
        Even though I disagree with this extension, I do think it’s an interesting and completely valid metaphor for your point.

        How would we handle regulating sex bots? A complete ban on manufacturing and import of full-size humanoid bots? They are large enough that it could be partially effective, I guess. I'm imagining two dudes in a shady meetup for a black-market sale of a sex bot, which is kinda funny but also scary because the future is coming fast.

        Or in this case, a husband having police investigate and apprehend the wife in the act? Crazy times.

      • OJFord 2 hours ago
        If your wife would rather leave you for a robot (or otherwise) then, err, yeah? That's not new with AI or robots or anything?
        • beeflet 2 hours ago
          You're correct that it is not new. Most people outside of hacker news are opposed to being cuckolded in their romantic relationships and automated out of their jobs.
      • brookst 2 hours ago
        Well that analogy of sex with one’s spouse being equivalent to the economic exchange of work for pay is going to keep me cringing for years. Ugh.
        • zwnow 44 minutes ago
          The whole point went above your head
        • warent 1 hour ago
          Runaway capitalism! Time for me to log off for another ~2 years
      • thekevan 3 hours ago
        Your marriage is a mutual decision between you and your spouse.

        A job is a decision that your boss(es) made, and it can be taken away without your consent. You don't have the ownership of your job that you do of your marriage.

        • recursivegirth 3 hours ago
          Both marriage and job contracts are mutually binding legal agreements. You have the agency within those dynamics that the law gives you, which varies by region/jurisdiction respectively.

          Your partner in some (most?) cases can absolutely make an executive decision that ends your marriage, with you having no options but to accept the outcome.

          Your argument falls a little flat.

          • Imustaskforhelp 2 hours ago
            I think I have discovered gold in the comments

            Someone makes a comment about how it's okay for things to be replaced through specialization in business

            Then someone equates it to intimacy

            Then someone says it's only possible on HN

            Then we get into some nifty discussion of whether we can argue about the similarity between marriage and job contracts, and at first they disagree

            Now we come to your comment, which I can kinda agree with, and here is my take

            Marriage and business both require some framework of laws and a trust in the state, which comes from the state's monopoly (well, legal monopoly) on violence, its ability to punish people who don't follow those laws, and its track record of handling cases

            As an example, I doubt marriages can be a good mutually binding legal agreement in somewhere like Saudi Arabia, which is misogynistic. The same can be said for exploitation in businesses: countries like Saudi Arabia and Qatar keep some people from South Asia (India, etc.) in a sort of legal slavery, where they are forced to reside in their own designated quarters of the country and are insanely restricted. Look it up.

            Also off topic, but I asked LLMs to find countries where divorce for women is illegal, and I confirmed one example: divorce in the Philippines for non-Muslims is banned (Muslim women's divorces are handled via sharia law). I have since fact-checked this via searching; it's not that divorce for women specifically is illegal, but that divorce itself isn't an option in the Philippines, which limits marital dissolution to annulment or legal separation.

            "In the Philippines, the general legal framework under the Family Code prohibits absolute divorce for the majority of the population, limiting marital dissolution to annulment or legal separation " [1]

            [1]: source: https://www.respicio.ph/commentaries/divorce-under-muslim-pe...

          • great_wubwub 2 hours ago
            I'm not sure where you live, but employee contracts in the US are very rare in tech. Unions, execs, and rock stars - that's about it. The rest of us are at-will and disposable. Worker protections in the US are limited to "the machine can't eat more than two of your fingers per day" and "you can't work people more than 168 hours in a week".
            • EgregiousCube 2 hours ago
              When you sign your offer letter, you're entering into an employment contract. What you're describing is regulatory limitations on what that contract can say, and how different contracts can have different terms.
              • onraglanroad 2 hours ago
                Now, from what you've said, I think they might be right. You don't get a full contract that you sign, detailing your job, leave entitlement, etc., in the US?
                • acuozzo 47 minutes ago
                  Unless it's a contracting or union position, what you get is something resembling a contract, but your agreement to it comes with the mutual understanding between you and your employer that no effective enforcement body exists to uphold your interests.

                  If it says "you work 40 hours per week and have 4 weeks of paid vacation" and your employer, EVEN IN WRITING, compels you to work 60 hour weeks and not take any vacation at a later date, then your only real option is to find work elsewhere. The Department of Labor won't have your back and you likely won't have enough money to afford a lawyer to fight on your behalf longer than the corporate lawyers your company has on staff.

                  Many programmers don't get treated this way because of the market, but abusive treatment of employees runs rampant in blue collar professions.

                  Need proof? Look at how few UNPAID weeks of maternity leave new mothers are entitled to under the law. This should tell you everything you need to know.

                  I have personally seen women return to work LESS THAN A WEEK after delivering a baby because they couldn't afford to not do so.

                  • onraglanroad 18 minutes ago
                    Oh I'm aware workers in the US are treated extremely badly. In the UK statutory maternity pay is 90% of full salary for the first 6 weeks and around $250 a week for the next 33 weeks.

                    But I was just trying to clarify if work contracts were a normal thing there. The original post said they weren't where you seem to be saying they are, but effectively unenforceable.

                • EgregiousCube 1 hour ago
                  We get that - it’s generally right there on the offer letter, often supplemented by a handbook that it refers to.
                  • onraglanroad 1 hour ago
                    I don't get it. Do you sign an offer letter or a contract?

                    So the normal routine here is you get an offer, if you accept you get sent a contract which is signed by the employer, if it's all ok you also sign and then you get your start date. Is it different in the US or the same?

            • dragonwriter 1 hour ago
              > I'm not sure where you live, but employee contracts in the US are very rare in tech.

              Single integrated written employment contracts are rare in the US for any but the most elite workers (usually executives); US workers more often have a mix of more limited domain written agreements and possibly an implied employment contract.

            • onraglanroad 2 hours ago
              > employee contracts in the US are very rare in tech

              Is that true? I've never had a job where I didn't sign a contract (in the UK and for multinationals including American companies). I wouldn't start without a contract.

              And I'm not in any rockstar position. It's bog standard for employees.

              • brookst 2 hours ago
                Do these contracts provide guarantees to you, or just to the employer? In the US it is entirely one sided and provides no protection from arbitrarily being fired without cause.
                • onraglanroad 2 hours ago
                  There are statutory rights that you have anyway, such as for a full time position you are entitled to 28 days leave, so the contract normally covers extra stuff (so I have 33 days under my current one).

                  Plus it covers things like disciplinary procedures, working hours etc. It's really weird to me that you don't have that. Are you sure it's normal?

                  To address your specific point, you can mostly be fired without reason if you're a new employee. You get more rights after 2 years so companies generally have a procedure to go through after that. You can always appeal to an employment tribunal but they won't take much notice if you've been there a couple of months and got fired for not doing your job.

        • beeflet 2 hours ago
          Impressive word games at play here; I almost didn't notice that you fail to explain what the actual difference is.

          A job is also a mutual decision between the employee and the employer.

          A marriage can also be taken from you without your consent through divorce (unless you are Orthodox Jewish or something, I think?).

        • latexr 2 hours ago
          > can be taken away without your consent.

          Note that isn’t universally true, for either case. Without mutual agreement, in the EU you can’t fire someone just because, and in Japan you can’t divorce unless you have proof of a physical affair or something equally damning.

        • aoeusnth1 2 hours ago
          You don’t own your marriage either? What exactly is the distinction you’re trying to make here, that you can hang onto your marriage even if your spouse doesn’t want it?
        • ako 1 hour ago
          Bosses should be able to make decisions about jobs or AI. That's OK.

          But as a society we have to ask ourselves whether replacing all jobs with AI will make for a better society. Life is not all about making as much money as possible. For a working society, citizens need meaning in their lives, and safety, and food, and health. If most people get too little of this, it may disrupt society and cause wars and riots.

          This is where government needs to step in; uncontrolled enterprise greed will destroy countries. But companies don't care, they'll just move to another country. And the ultra-rich don't care, they'll just put larger walls around their houses or move country.

      • giancarlostoro 21 minutes ago
        You know that episode where Fry dates the robot, and they show him a really old PSA? Man, I never thought it would happen within my lifetime.
      • rapatel0 3 hours ago
        If robots are a better partner, then maybe you are not that great a partner...
        • sssilver 3 hours ago
          Not necessarily. It’s either that, or the robots are so good they're unattainable by imperfect human flesh.
        • lwhi 2 hours ago
          If robots are a better partner, then maybe you no longer understand what it means to be human.
          • Imustaskforhelp 2 hours ago
            Another interesting point: if someone thinks that robots can be a better partner, then they also no longer understand what it means to be human.

            Maybe it depends on what you want in a relationship. AI is sycophantic, and that could appeal to people who have trust issues with humans in general or with the other sex (which is happening way more than you might think in younger generations, whether that's involuntary celibates or whatever).

            I don't blame people for having trust issues, but letting them live longer in the false hope that robots are partners would just keep them stuck even longer and wouldn't help them.

            Whether there should be regulations on this depends on whether it becomes a bigger issue; most people, including myself, feel like the govt. shouldn't intervene in many things, but still. I don't think it's happening any time soon, since AI big-tech money and the stock markets are so bedded together it's wild.

            • lwhi 2 hours ago
              I really don't buy the idea of AI helping with 'X' issue by becoming a surrogate, because at its root this is still avoidance.

              It's not solving the problem, it's diverging away from it.

              • Imustaskforhelp 1 hour ago
                That is what I wrote, if I wasn't clear. Thanks for putting it in clearer words, I suppose.

                I 100% agree. I mean, that was what I was trying to convey, I guess, even if I got sidetracked thinking about govt regulation, but yeah, I agree completely.

                It's a sort of self-sabotage, but hey, one thing I have come to know about humans is that judging them for things is gonna push them even further into us vs. them. We need to know the reasons why people find it so easy to conform to LLMs. I guess sycophancy is the idea: people want to hear that they are right and the world is wrong. Most people sometimes don't give a fuck about others' problems, and if they do, they try to help, and that can involve pointing to reality. AI just delays that by saying something sycophantic, which drives a person even further into the hole.

                I guess we need to understand them. It's become a reality, dude; there are people already marrying chatbots or what not. I know you must have heard of these stories...

                We are not talking about something in the distant future; it's happening as we speak.

                I think the answer to why is desperation. There are so many things broken in society and in dating that young people feel like being alone is better, with chatbots to satisfy whatever they are feeling.

                I feel like some people think they deserve love, and there's nothing wrong with that, but at the same time you can't expect any particular person to just love you; they are right to think about themselves too. So those people who feel they deserve love flock to a chatbot, which showers them with sycophancy and fake love. But people are down bad for fake love as well; they will chase anything that resembles love, even if it's a chatbot.

                It's a societal problem, I suppose. Maybe the internet fueled it accidentally: we fall in love with people over just texts, so we have equated a person with texts and thus with love, and now we have clankers writing texts and fellow humans interpreting it as love.

                Honestly, I don't blame them; I sympathize with them. They just need someone to tell their day to. Putting them down isn't the answer, but talking with them and encouraging them to get professional therapy in the process could be great. So many people can't afford therapy that they go to LLMs instead, so that's definitely something. We might need to invest some funding to make therapy more accessible for everybody, I guess.

              • majkinetor 1 hour ago
                Diverging away from it is one way to solve it, since it no longer affects you, no? You can't remove the context. There are no problems in a vacuum.
        • Spivak 2 hours ago
          This is a perfect microcosm of this discussion. You're going to be replaced by robots because they will be better than you at everything but also if they're better than you at anything then that's your personal moral failing and sucks to be you.

          HN Comment in 2125: Why would I have casual sex with a real guy when I can have a sexual partner whom I can tailor perfectly to my in-the-moment desires, who can role-play anything including the guy in the romance novel I'm reading, doesn't get tired, is tall and effortlessly strong, has robotic dexterity, is available 24/7, exists entirely for my pleasure letting me be as selfish as I want, and has port and starboard attachments.

          What makes you think that sex is some sacred act that won't follow the same trends as jobs? You don't have to replace every aspect of a thing to have an alternative people prefer over the status quo.

        • sejje 2 hours ago
          In a hypothetical future, the robot could link up to your wife's brain interface/"neuralink" for diagnostic level information, and directly tune performance instantly based on what's working.

          You can compete, but not for long IMO. (No pun)

      • xethos 3 hours ago
        "But that ignores why you have sex with your wife: for bonding, physical affection, and pregnancy"

        Sure, but we're also putting aside how people do worse without a sense of purpose or contribution, and semi-forced interaction is generally good for people as practice getting along with others - doubly so as we withdraw into the internet and our smartphones

      • gnarlouse 3 hours ago
        “Robots are coming”

        I see what you did there

      • jarjoura 1 hour ago
        Hah. You say this in such a way that you leave out the possibility that robots are actually just coming for you. Robots can do you, better and/or faster than your partner. Who cares if they're coming for your partner if you can equally have a robot make you feel and experience things you could only imagine.
        • zwnow 43 minutes ago
          Least gooner take on here
      • thelastgallon 2 hours ago
        The missing piece is the material science advances to get the 100% fidelity mouth/etc feel. Once we have that breakthrough, all men will just buy those devices (and video games) and chill. No need for any of these questionable biological services (marriage, etc) with no guarantee and a whole lot of heartache.

        Any company that solves this problem will be a $10T company.

        • rpcope1 32 minutes ago
          That will also likely be the very end of our society as we know it. If you thought TFR was in the shitter today, having robot prostitutes (because let's be real, that's what you're suggesting) will not only probably grant a select few some unreal control over many, but will be the end of any society dumb enough to permit it entirely.
      • rpcope1 26 minutes ago
        I'm unsure if the comparison to more standard automation is totally apt (though I agree that if you rob people of the purpose many get from their jobs, lots and lots of bad shit happens), but what you're poking at is frankly nightmare fuel. Look how much chatbots in sycophantic "romantic relationships" absolutely break people (there was some subreddit linked recently that was full of these types), and consider what happens when they're actually physically manifested. I'm sure some lolbert types will be like "but muh freedoms", but that sort of shit is so dangerous and so destructive, in the same way brown glass bottles are to Australian jewel beetles, that we damn well better crush any of it and anyone who proposes producing robots like that. They're basically weapons and should be treated as such.
      • pessimizer 2 hours ago
        Extremely weird comment. But no, you don't own your job or your wife. If robots are better than you, she should leave you and we should all be happy for her.

        The problem is a culture that doesn't think the profit from productivity gains should be distributed to labor (or consumers), and doesn't think that wives deserve to be happy.

      • soganess 3 hours ago
        Honestly, from what I understand, most men’s relationship to “lovemaking” isn’t exactly winning awards. Plus, if the tables were turned, I’m sure some SV types would just call it “rational” or “logical” and magically develop nuanced yet expansive understandings of consent, autonomy, and ownership (“your wife”) overnight.

        Assuming the Everdrive is M and the SNES cartridge port is F, I can understand why the Everclan men are particularly attuned to this topic. Many better-quality, more feature-rich, and cheaper SNES multicarts have hit the market; the Everdrive is looking dated.

      • riversflow 3 hours ago
        What a non sequitur. Your wife is an adult human being. Sex is a consensual act. Not really comparable to trading time and labor for currency.
        • avmich 2 hours ago
          Both job and sex are rather common and desired, so some comparison can be made.
      • amarant 3 hours ago
        I mean, she already has 3 tiny pink little robots designed specifically for the purpose.

        This isn't exactly news

        • boogieknite 1 hour ago
          "this guy fucks" - Russ Hannamen, HBO's Silicon Valley

          surprised to see this so far down. if a robot can fuck better, then we would probably both have fun fucking robots together

          • amarant 48 minutes ago
            Indeed! And "it might lead to better dildos" is genuinely the weirdest argument against AI I've heard so far.
      • nakamoto_damacy 20 minutes ago
        I will pay big money to see you do standup comedy. You made my day. LOL.
      • anonnon 1 hour ago
        Given the overlap between reddit and HN ("Orange Reddit"), using cuckolding as a negative point of comparison probably wasn't the best choice.
        • rpcope1 18 minutes ago
          Honestly, reading the comments in this thread has convinced me that you're right; the blight between the two is nearly one and the same.
      • hasmolo 3 hours ago
        look it's a straw man!
        • mlyle 3 hours ago
          But it's not, really.

          Machines doing stuff instead of humans is great as long as it serves some kind of human purpose. If it lets humans do more human things as a result and have purpose, great. If it supplants things humans value, in the name of some kind of efficiency that isn't serving very many of us at all, that's not so great.

          • pessimizer 1 hour ago
            First I've heard that wives aren't human.
          • SJC_Hacker 2 hours ago
            It does serve a human purpose, for the ownership class, not for the laborers
      • mock-possum 1 hour ago
        This, only unironically

        Besides, in this fantasy, what’s to stop you from having the perfect robot lover as well - why are you so attached to this human wife of yours in the first place?

      • AfterHIA 3 hours ago
        I wonder if the AI can teach you how to use commas correctly.
      • kristianc 1 hour ago
        > If robots can do your wife better and/or faster, they should be the ones doing the job.

        Skill issue.

    • JimDabell 2 hours ago
      > robots coming for your jobs is not a valid argument against robots.

      Taking work away from people is practically the definition of technology. We invent things to reduce the effort needed to do things. Eliminating work is a good thing, that’s why inventing things is so popular!

      What ends up happening is the amount of work remains relatively constant, meaning we get more done for the same amount of effort performed by the same amount of people doing the same amount of jobs. That’s why standards of living have been rising for the past few millennia instead of everybody being out of work. We took work away from humans with technology, we then used that effort saved to get more done.

      • ori_b 23 minutes ago
        > What ends up happening is the amount of work remains relatively constant

        That's a pretty hard bet against AGI becoming general. If the promises of many technologists come to pass, humans remaining in charge of any work (including which work should be done) would be a waste of resources.

        Hopefully the AGI will remember to leave food in the cat bowls.

      • lamename 2 hours ago
        I agree with most everything you said. The problem has always been the short-term job loss, particularly today, when society as a whole has the resources for safety nets but hasn't implemented them.

        Anger at companies that hold power in multiple places, and use it to worsen this situation for people rather than prevent it, is valid anger.

        • nativeit 1 hour ago
          There's another problem: who gets to capture all of the resulting wealth from the higher, tech-assisted productivity.
      • erichocean 1 hour ago
        To date, we have only been able to make the work that humans do more effective (by creating technology, machines, etc.).

        Now we are on the cusp of replacing humans end-to-end, so that they are no longer required at any point in the work process.

        Replaced by AI that is cheaper and faster and more reliable, and (like any technology) that gets cheaper and faster and more reliable every year.

        Humans literally obsolete, uncompetitive, economically useless—with no hope of ever "catching up."

        • YZF 47 minutes ago
          We've replaced human jobs in the past not just made them more efficient. Horses (and all the jobs to do with them) were completely displaced by cars. Those jobs aren't more efficient, they're gone. Similar with many other jobs during the industrial revolution.

          This is not a zero sum game. For an economy to exist we need consumers. For consumers to exist we need people to have jobs and be paid. So the equilibrium is that there will be some new jobs somewhere, not done by robots, that will pay people enough to consume the (better and cheaper) goods made by those robots. Or we'll just have a lot of leisure time and get paid by the government. Or (like some other discussion thread) we'll all be wiped out or slaves in the salt mines if the elites can just sustain/improve without us and are able to enforce it. Either way, it's not the scenario where we're out of jobs sitting at home.

          • rpcope1 13 minutes ago
            > This is not a zero sum game. For an economy to exist we need consumers.

            I think that's really unimaginative and not thinking about it right. If you control the basic resources and own the tools needed to convert those into whatever you want, why does the "economy" even matter? If you can get anything you want without needing anything from the other 95% of people, having "consumers" in the sense you're thinking doesn't matter any more.

    • didibus 1 hour ago
      > Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.

      -- Frank Herbert, Dune

      The "government" is just the set of people who hold power over others.

      Those who will own machines that can match humans will have unprecedented power over others. In effect they'll grow more and more to be the government.

      Even now, companies hold more and more of the power over others and are more part of the government than ever before.

      So it confuses me when you say it's what the government is for? Who would that be? If we pretend it would still be a democracy then I guess you're saying it's everyone's problem?

      So here we are, let's discuss the solution and vote for it?

      • jimnotgym 1 hour ago
        Voting for it has become really difficult in countries with First Past The Post voting systems, where the only parties that could win are comprehensively captured by the elite.
      • rpcope1 11 minutes ago
        Butlerian Jihad fucking when? I'm ready.
      • bittercynic 1 hour ago
        >The "government" is just the set of people who hold power over others.

        Often, yes, but in a more functional society it would be the mechanism we collectively use to prevent a few people from amassing excessive wealth and power.

        • serial_dev 1 hour ago
          Where did this ever work out? It’s extremely common that policies are universally hated by the people, yet the people in power do it anyway.
          • pydry 43 minutes ago
            Switzerland.

            America isn't really a democracy; it's a plutocracy.

        • 010101010101 1 hour ago
          Those two things don’t sound mutually exclusive to me.
    • notthemessiah 16 minutes ago
      The problem is that the AI companies are most interested in displacing the existing labor force more so than they are interested in developing new uses of AI in areas that humans are inherently bad at. They are more interested in replacing good jobs with AI rather than bad jobs. It's not that machines are doing jobs better, they are just doing them for cheaper and by cutting more corners.

      Best summarized in this comic: https://x.com/9mmballpoint/status/1658163045502267428

    • Loughla 3 hours ago
      The problem is that AI and advanced robotics (and matter synthesis and all that future stuff) must come with a post scarcity mindset. Maybe that mindset needs to happen before.

      Either way, without that social pattern, I'm afraid all this does is enshrine a type of futuristic serfdom that is completely insurmountable.

      • citizenpaul 3 hours ago
        > post scarcity mindset

        A total shift of human mentality. Humans have shown time and again there is only one way we ever get there. A long winding road paved with bodies.

        • Imustaskforhelp 2 hours ago
          I looked it up at some point, and there is enough food to end world hunger once or twice over (once I know for a fact; I saw a Reddit article that said twice, but I'm not pulling up that source, but you get it).

          > Less than 5% of the US’ military budget could end world hunger for a year. [1]

          My friend, we live in 1984, where the main character discovers that Eurasia and the others have enough food and resources to help everybody out, but they constantly fight and diminish them just so that people are easy to manipulate in times of war.

          I discovered this 5% statistic just now and it's fucking wild. The US by itself could end world hunger 20 times over, but it spends it all on the military. I don't know if that statistic is wrong or not, but this is fucking 1984. Big Brother has happened. We just don't know it yet, because the propaganda works on us as well; people in 1984 didn't know they were living in 1984.

          And neither do we.

          Have a nice day :)

          Sources [1]: https://www.globalcitizen.org/en/content/hunger-global-citiz...

        • Loughla 2 hours ago
          That's valid, but I can still hope for better and dread that result.

          I genuinely believe we'll see technological world war 3 in my child's lifetime. And I'm not a super supporter of that.

      • sambull 2 hours ago
        I think it's going to come with eradicating the 'wastrel' mindset.
      • dingnuts 2 hours ago
        [dead]
    • reactordev 3 hours ago
      What you end up with is a dozen people owning all the wealth and everyone else owning nothing, resulting in the robots not doing anything because no one is buying anything, resulting in a complete collapse of the economic system the world uses to operate. Mass riots, hunger wars, political upheaval, world war 3. Nuke the opposition before they nuke you.
      • skybrian 3 hours ago
        That’s one scenario, but there are others. There are lots of open-weight models. Why wouldn’t ownership of AI end up being widely distributed? Maybe it's more like solar panels than nuclear power plants?
        • throwaway0123_5 3 hours ago
          In terms of quality of life, much/most of the value of intelligence is in how it lets you affect the physical world. For most knowledge workers, that takes the form of using intelligence to increase how productively some physical asset can be exploited. The owner of the asset gives some of the money/surplus earned to the knowledge worker, who can then use the money to effect change in the physical world by paying for food, labor, shelter, etc.

          If the physical asset owner can replace me with a brain in a jar, it doesn't really help me that I have my own brain in a jar. It can't think food/shelter into existence for me.

          If AI gets to the point where human knowledge is obsolete, and if politics don't shift to protect the former workers, I don't think widespread availability of AI is saving those who don't have control over substantial physical assets.

        • gnarlouse 3 hours ago
          My potato RTX vs your supercomputer cluster and circular chipfab/ai-training economy. Challenge:

          “Be competitive in the market place.”

          Go.

          • skybrian 37 minutes ago
            Can't find it now, but I read an article about someone adding AI help to a large appliance. Can't assume WiFi is set up, so it has to work offline. Frontier model not required.

            I don't think we will be building these things ourselves, but I think there will still be products you can just buy and then they're yours.

            It would be the opposite of the "Internet of things" trend though.

          • baconbrand 3 hours ago
            Your potato RTX that uses a finite amount of power, is already paid for, and does things that you know are useful verses your supercomputer cluster and circular funding economy that uses infinite power, is 10x more overleveraged than the dot com bubble, and does things that you think might be useful someday. Challenge:

            “Don’t collapse the global economy.”

            :)

            • gnarlouse 3 hours ago
              Excellent. Completely fair.
          • jayd16 2 hours ago
            What market? If this shocks employment numbers what can happen other than market collapse?
            • reactordev 2 hours ago
              This is the only outcome any economic model concludes. Complete market collapse. It will scream before it collapses (meaning it will shoot to the moon, then completely collapse). Way worse than the Great Depression because instead of 26% unemployment, it will be 80%.
        • 8note 1 hour ago
          The AI isn't the useful thing; the stuff you can do with it is.

          If your job is replaced by AI, having AI at home doesn't change whether you're making money or not.

          The capital owner gets their capital to work more effectively, and you, without capital, don't get that benefit.

        • vineyardmike 2 hours ago
          Economies of scale have been a huge influence on the last ~300 years of industrial change. In 1725, people sat at home hand-crafting things to sell to neighbors. In 1825, capitalists were able to open factories that hired people. By 1925, those products were built in massive factories that employed entire towns and were shipped all over the country on railroads. By 2025, factories might be partially automated while hiring tens of thousands of people, cost billions to build, and distribute their products globally. The same trend applies to knowledge work as well, despite the rise of personal computing.

          Companies are spending hundreds of millions of dollars on training AI models, why wouldn’t they expect to own the reward from that investment? These models are best run on $100k+ fleets of power hungry, interconnected GPUs, just like factory equipment vs a hand loom.

          Open-weight models are a political and marketing tool. They’re not being opened up out of kindness, or because “data wants to be free”. AI firms open models to try to destabilize American companies by “dumping”, and as a way to incentivize companies that don’t like closed-source models to buy their hosted services.

          • skybrian 18 minutes ago
            I think a lot of people will be okay with paying $20 a month if they're getting value out of it, but it seems like you could just buy an AI subscription from someone else if you're dissatisfied or it's a bit cheaper?

            This is not like cell service or your home ISP; there are more choices. Not seeing where the lock-in comes from.

        • intended 2 hours ago
          Because the open models are not going to make as much money?
        • candiddevmike 3 hours ago
          If we're in fantasy land about AI, why do we keep thinking anyone will actually _own_ AI? Human hubris alone cannot contain a super intelligent AI.
          • gnarlouse 3 hours ago
            What makes you assume the AI companies actually want to create a superintelligence they can’t control? Altman has stated as much. Musk definitely wants to remain in power.
          • wahnfrieden 3 hours ago
            I sometimes wonder about what our governments would do if one of the businesses in their jurisdictions were to achieve AGI or other such destabilizing technology. If it were truly disruptive, why would these governments respect the ownership of such property and not seize it - toward whatever end authority desires. These businesses have little defense against that and simply trust that government will protect their operations. Their only defense is lobbying.
            • NoOn3 2 hours ago
              As we have almost seen, they will do absolutely nothing because they are afraid of losing to competing countries.
              • wahnfrieden 1 hour ago
                AGI is the end-game scenario. That is "winning". If a business wins it, then the government may not remain subservient to it, no matter what free-market conditions it had preserved beforehand, as long as it has the power to act.
      • cloverich 1 hour ago
        That is the system we have today, directionally. AI is an opportunity to accelerate it, but it is also an opportunity to do the opposite.
      • erichocean 1 hour ago
        Why not just feed, clothe, and house the poor? It's not like it requires human labor to do so, just point your robots at it.
      • throwaway-0001 3 hours ago
        Robots will do stuff for the rich people's ecosystem.

        As for the rest, you know what’s going to happen.

        • philipkglass 3 hours ago
          If robots can do all industrial labor, including making duplicate robots, keeping robots the exclusive property of a few rich people is like trying to prevent poor people from copying Hollywood movies. Most of the world doesn't live under the laws of the Anglosphere. The BRICS won't care about American laws regarding robots and AI if it proves more advantageous to just clone the technology without regard for rights/payment.
          • throwaway0123_5 3 hours ago
            I don't see how owning a robot helps me with obtaining the essentials of life in this scenario. There's no reason for a corporation to hire my robot if it has its own robots and can make/power them more efficiently with economy of scale. I can't rent it out to other people if they also have their own robots. If I already own a farm/house (and solar panels to recharge the robots) I guess it can keep me alive. But for most people a robot isn't going to be able to manufacture food and shelter for them out of nothing.
      • dfilppi 3 hours ago
        [dead]
    • strogonoff 2 hours ago
      “Robots coming for your jobs” is a valid argument against robots even if they can do those jobs better and faster, under two assumptions: 1) humans benefit from having jobs and 2) human benefit is the end goal.

      Both are fairly uncontroversial: many humans not only benefit from jobs but in fact often depend on jobs for their livelihoods, and (2) should be self-evident.

      This can change if the socioeconomic system is quickly enough and quite substantially restructured to make humans not depend on being compensated for work that is now being done by robots (not only financially but also psychologically—feeling fulfilled—socially, etc.), but I don’t see that happening.

      • jhbadger 43 minutes ago
        This is valid only insofar as "human benefit" is localized to the human doing the job. I'm a cancer researcher. Obviously, my job is of value to me because it pays my bills (and yes, I do get satisfaction from it in other ways). But if an AI can do cancer research better than me, then the human benefit (to every human except perhaps me) favors the AI over me.

        But a lot of jobs aren't like that. I doubt many people who work in, say, public relations, really think their job has value other than paying their bills. They can't take solace in the fact that the AI can write press releases deflecting the blame for the massive oil spill that their former employer caused.

    • harryf 3 hours ago
      It’s worth (re)watching the 1985 movie Brazil, in particular the character of Harry Tuttle, heating engineer: https://youtu.be/VRfoIyx8KfU

      Neither government nor corporations are going to “save us”, simply because of sheer short-termism and incompetence. But the same incompetence will make the coming dystopia ridiculous.

      • exe34 3 hours ago
        I do wonder if somewhere like China might be better off - they might not have muh freedumb, but their government seems keen to look after the majority and fund things that corporations wouldn't.
        • thijson 2 hours ago
          I believe that if the elites in China didn't need the populace, they would dispense with them.
        • 01284a7e 1 hour ago
          "China might be better off"

          Uh...

    • leobg 3 hours ago
      Is it just income that’s the issue? I’d rather say it’s purpose. Even more: What will happen to democracy in a world where 100% of the population are 27/7 consumers?
      • IHLayman 3 hours ago
        “ What will happen to democracy in a world where 100% of the population are 27/7 consumers?”

        …we’ll add three hours to our day?

        But seriously, I support what you are saying. This is why the entire consumer system needs to change, because in a world with no jobs it is by definition unsustainable.

        • ryandrake 2 hours ago
          Not if it ends up a literal utopia where robots do all the work, all humans share equally in the benefits, and people get to live their lives with a purpose other than toiling for N hours a week to earn M survival tokens, which is what we have today. Good luck coming up with the political will to actually implement that utopia, though.
      • creer 1 hour ago
        > What will happen to democracy in a world where 100% of the population are 27/7 consumers?

        What does the one have to do with the other?

        But even then, plenty of people currently find their fun in creating, when it's not their job. And they struggle with finding the time for that, and sometimes the materials and training and machines as well. Meanwhile, a majority of current jobs involve zero personal creativity or making or creating. Driving, or staffing a retail outlet, or even most cooking jobs can't really be what you are looking for in your argument?

        Is the road to post-scarcity more likely with or without robots?

      • Gepsens 3 hours ago
        Smaller cities, human-sized, humans closer to nature, robots bringing stuff from factories by driving. Done.
        • yoz-y 1 hour ago
          Seeing how the humans behave when an extreme minority is able to control the production and the army…
    • yoyohello13 3 hours ago
      I also agree with this, but I think there is a need to slow the replacement, by a bit, to reduce the short term societal harm and allow society to catch up. Robots can’t do the jobs if society collapses due to unrest.

      Progress is great obviously, but progress as fast as possible with no care about the consequences is more motivated by money, not the common good.

      • erichocean 1 hour ago
        > Robots can’t do the jobs if society collapses due to unrest.

        Of course they can, robots aren't dependent on humans at all. That's the point.

        It's not "replacement" if they are.

      • iwontberude 3 hours ago
        What you mean you don’t want to take a Great Leap Forward?
        • CamperBob2 3 hours ago
          A Great Leap Forward is what you get when you give a few fanatical ideologues too much power. I don't see anything in my history book about robots or AI being connected with that. They seem like different topics altogether.
          • yoyohello13 3 hours ago
            > give a few fanatical ideologues too much power.

            Uh, have you seen the US lately? I think that ship has sailed.

            • CamperBob2 1 hour ago
              Well, we're certainly taking a Great Leap, so that part checks out at least.
    • rafaelmn 1 hour ago
      The government and all social structures developed because your labour has value, and division of labour/specialisation is so effective that it outperforms the alternatives. Cooperation beats violence, iterated prisoner's dilemma, etc.

      None of this holds if you don't have anything of value to offer, and automation is concentrating power and value - AI is the extreme end of this. At some point the charade of democracy becomes too annoying to the ones at the top, and you get there faster by trying to rein them in.

    • philjackson 3 hours ago
      > This is what government is for

      They're busy selling watches whilst people can still afford them thanks to having jobs.

    • grafmax 1 hour ago
      Problem is, billionaires have co-opted our government. Their interest is in channeling money from the working class into their hands through rentier capitalism. That is contrary to distributing income widely.

      Rent extraction hurts them in the long run. Because working-class income gets absorbed by various forms of rent, workers are more expensive to employ. Thus we fail to compete with, say, China, which socializes many costs and invests in productive industry. We are left with a top-heavy society that, as we can see, is already starting to crumble.

    • arrosenberg 1 hour ago
      Kind of hard for the government to “prepare society to move forward” when the AI companies and their financiers lobby for conditions that worsen the ability of society to do so.
    • thayne 2 hours ago
      I have very little faith in the government to fix that problem.

      And even if the government did institute something like universal basic income, if all jobs were replaced, that would almost certainly mean a lower standard of living for the middle class, and even less socioeconomic mobility than there is now.

    • jvanderbot 11 minutes ago
      It's one thing to be fired and completely replaced by a robot, never to work again. It's another to have your industry change and embrace automation, but to remain in it with higher productivity and a new role. You might not initially like the new industry or role, but ....

      That second path is noble. The first is dystopian.

    • jayd16 2 hours ago
      If wealth inequality were greatly reduced, we wouldn't have to worry about a lot of these topics nearly as much.
    • 1dom 2 hours ago
      > This is what government is for, and not to stifle innovation by banning AI but by preparing society to move forward.

      What if the issue isn't government failing to prepare society to move forward, but instead, AI businesses moving us in a direction that more and more people don't consider to be forward?

    • lwhi 3 hours ago
      Government no longer has the power or authority to constrain private enterprise, especially in highly technical sectors.
      • StevePerkins 44 minutes ago
        Of course it does. Do you think the elites actually WANT massive tariffs putting a brake on GDP growth? Why are tech companies suddenly reversing course on content moderation and/or DEI, after years of pushing in the opposite directions?

        Private enterprise will always have some level of corrupting influence over government. And perhaps it sees current leadership as the lesser of two evils in the grand scheme. But make no mistake, government DOES ultimately have the power, when it chooses to assert itself and use it. It's just a matter of political will, which waxes and wanes.

        Going back a century, did the British aristocracy WANT to be virtually taxed out of existence, and confined to the historical dustbin of "Downton Abbey"?

      • xpe 2 hours ago
        Of course government has the authority to represent the people; if not it, then who or what does?
        • lwhi 2 hours ago
          I didn't say government doesn't have the power to represent the people, now did I?
          • xpe 1 hour ago
            You wrote:

            > Government no longer has the power or authority to constrain private enterprise; especially in highly technical sectors.

            Perhaps you meant only this: "Government no longer has the power ..."? It is clear government still has the authority based on the will of the people.

    • dotancohen 3 hours ago

      > This is what government is for, and not to stifle innovation by banning AI but by preparing society to move forward.

      Many will argue that the purpose of government is not to steer or prepare society, but rather to reflect the values of society. Traditionally, the body that has steered (prepared or failed to prepare) society for impending changes was religion.
      • nwatson 3 hours ago
        Enter the Dominionists, gaining steam now. Not a regime I want to live under. Here's a forty-year-old article describing the inception of those religious figures close to the current USA administration ... https://banner.org.uk/res/kt1.html
    • overfeed 3 hours ago
      Where would governments find the money to expand safety nets by 2-3 orders of magnitude, while losing most income tax inflows?
      • dragonwriter 3 hours ago
        > Where would governments find the money to expand safety nets by 2-3 orders of magnitude, while losing most income tax inflows?

        Well, it would start by not tax-favoring the (capital) income that remains - income that would have had to grow massively relative to the overall economy for that scenario to occur.

        (In fact, it could start by doing that now, and the resulting shift in tax burden would reduce the artificial tax incentive to move from labor-intensive to capital-intensive production methods, which would, among other things, buy more time to deal with the broader transition if it is actually going to happen.)

      • risyachka 3 hours ago
        You need income to:
        - buy a house
        - get food
        - buy clothes
        - pay for medical care
        - buy nice things

        If robots become so advanced that they can do most of the jobs, the cost of goods will be close to zero.

        Government will produce and distribute most of the things above, and you mostly won't need any money, but if you want extra to travel etc. there will always be a bunch of work to do - and not 8 hours per day.

        • dragonwriter 3 hours ago
          > if robots are that advanced that can do most of the jobs - the cost of goods will be close to zero.

          No, the cost of goods will be the cost of the robots involved in production amortized over their production lifetime. Which, if robots are more productive than humans, will not be “near zero” from the point of view of any human without ownership of at least the number of robots needed to produce the goods that they wish to consume (whether that’s private ownership or their share of socially-owned robots). If there is essentially no demand for human labor, it will, instead, be near infinite from their perspective.

          • risyachka 1 hour ago
            The cost of robots will be zero because robots will mine ore, process it and make other robots.
            • dragonwriter 1 hour ago
              > The cost of robots will be zero because robots will mine ore, process it and make other robots.

              This assumes, among other things that are unlikely to be true, that the only cost of extraction is (robot) labor, and that mining rights are free, rather than being driven by non-zero demand and finite supply.

              (Another critical implicit assumption is that energy is free rather than being driven by non-zero demand and finite supply, as failing that will result in a non-zero price for robot labor even if the materials in the robot were free.)
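
              To make the arithmetic concrete, here is a minimal sketch of the unit-cost argument in this subthread, in Python; the robot price, lifetime output, energy, and materials figures are purely illustrative assumptions, not data from the thread:

                  # Per-unit cost of a robot-made good: amortized capital plus
                  # non-labor inputs. All numbers are hypothetical placeholders.
                  robot_price = 100_000.0     # purchase cost of one robot (assumed)
                  lifetime_units = 1_000_000  # goods produced over its working life (assumed)
                  energy_per_unit = 0.05      # energy cost per unit; non-zero even if labor is "free" (assumed)
                  materials_per_unit = 0.50   # ore and mining-rights cost per unit (assumed)

                  amortized_capital = robot_price / lifetime_units
                  unit_cost = amortized_capital + energy_per_unit + materials_per_unit

                  print(f"amortized capital per unit: {amortized_capital:.2f}")
                  print(f"total cost per unit:        {unit_cost:.2f}")

              Even with the robot's own labor priced at zero, the unit cost stays pinned above the energy and materials terms, which is the point being made above.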

        • gmerc 3 hours ago
          No. You become worthless to the government and will be treated accordingly
        • yoyohello13 3 hours ago
          Sure... in 200 years. The trick is getting through the cultural/economic shifts required to get to that point without blowing up the earth in WW3.
          • risyachka 1 hour ago
            I think the trick is to start preparing like it will happen in 10 years and hope it will take 200, otherwise wars are imminent.
        • beeflet 3 hours ago
          There are still capital costs in producing the robots and infrastructure. So the costs of goods will be nonzero.

          > Government will produce and distribute most of the things above, and you mostly won't need any money

          So basically what you are saying is that a government monopoly will control everything?

          • risyachka 1 hour ago
            Government is people; it is not some corporation.

            >> government monopoly

            there is no monopoly if there is no market

        • candiddevmike 3 hours ago
          I'm pretty sure if robots are capable of doing most jobs, the only ones left will be related to prostitution or fluid/tissue/organ donation.
        • pojzon 3 hours ago
          Rich people doing something to undermine their status as „the better ones”?

          This is not going to happen.

          We all know a post-apocalyptic world is what awaits us.

          More or less, Elysium is the future if people keep behaving the way they do now.

          And I doubt people will change in the span of 100 years.

    • jaccola 3 hours ago
      The way I think about this is either the job is done in the most efficient way possible or I am asking everyone else to pay more for that product/service (sometimes a worse product/service) just so I can have a job.

      E.g. if I was a truck driver and autonomous trucks came along that were 2/3rds the price and reduced truck related deaths by 99% obviously I couldn't, in good faith, argue that the rest of the population should pay more and have higher risk of death even to save my job and thousands of others. Though somehow this is a serious argument in many quarters (and accounts for lots of government spending).
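
      As a rough illustration of that trade-off, here is a minimal sketch (in Python) using the 2/3-price and 99%-fewer-deaths figures from this comment; the baseline freight spend and fatality count are purely hypothetical assumptions:

          # Hypothetical baseline numbers (assumptions, not real statistics).
          annual_trucking_cost = 900e9   # yearly spend on trucking, in dollars (assumed)
          annual_truck_deaths = 5_000    # yearly truck-related fatalities (assumed)

          # Figures taken from the comment above.
          autonomous_cost_ratio = 2 / 3  # autonomous trucking costs 2/3 as much
          death_reduction = 0.99         # 99% fewer truck-related deaths

          savings = annual_trucking_cost * (1 - autonomous_cost_ratio)
          deaths_avoided = annual_truck_deaths * death_reduction

          print(f"yearly savings spread across everyone else: ${savings / 1e9:.0f}B")
          print(f"yearly deaths avoided: {deaths_avoided:.0f}")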

      • code4life 2 hours ago
        This is what minimum wage deals with to some extent. Governments decide what jobs may exist based on how much a company can pay to do the job.
        • erichocean 1 hour ago
          Sure, but there's no "minimum wage" for technology, only for human employment.

          And we aren't talking about human employment.

    • beeflet 3 hours ago
      Firstly, they are not coming for my job, they're coming for all jobs.

      Secondly, you assume in the first place that we can somehow build a stable post-scarcity society in which people with no leverage can control the super-intelligent agents with all of the power. The idea that "government will just fix it" is totally ignorant of what the government is or how it emerges. In the long run, you cannot have a ruling class that is removed from the keys to power.

      Lastly, who says we should all support this future? What if I disagree with the AI revolution and its consequences?

      It is kind of amazing how your path of reasoning is so dangerously misoriented and wrong. This is what happens when people grow up watching Star Trek: they just assume that once we live in a post-scarcity future everything will be perfect, and that this is the natural endpoint for humanity.

      • derektank 2 hours ago
        >Firstly, they are not coming for my job, they're coming for all jobs.

        They're not coming for all jobs. There are many jobs that exist today that could be replaced by automation but haven't been, because people will pay a premium for them to be done by a human. There are a lot of artisan products out there which are technically inferior to manufactured goods, but people still buy them. Separately, there are many jobs which are entirely about physical and social engagement with a flesh-and-blood human being; sex work is the most obvious, but live performances (how else has Broadway survived the era of mass adoption of film and television?) and personal care work like home health aides, nannies, and doulas are all at least partially about providing an emotional connection on top of the actual physical labor.

        And there's also a question of things that can literally only be done by human beings, because by definition they can only be done by human beings. I imagine in the future, many people will be paid full time to be part of scientific studies that can't easily be done today, such as extended, large cohort diet and exercise studies of people in metabolic chambers.

        • beeflet 2 hours ago
          >They're not coming for all jobs.

          So we are all going to just do useless bullcrap like sell artisan clay pots to each other and pimp each other out? Wow, some future!

          I just don't know how this goofball economy is going to work out when a handful of elites/AI overlords control everything you need to eat and survive, and everyone else is weaving them artisan wicker baskets and busking (jobs which are totally automated and redundant, mind you, but the elites would keep us around for the sentimental value).

          >I imagine in the future, many people will be paid full time to be part of scientific studies that can't easily be done today, such as extended, large cohort diet and exercise studies of people in metabolic chambers.

          Yeah, this is one plausible future: we could all be lab rats testing the next cancer medicine or donating organs to the elites. I can't imagine the conditions will be very humane, given that the unwashed masses will have basically no leverage to negotiate their pay.

    • vlovich123 2 hours ago
      My reading of history is that human society is able to adjust to small changes like this over long periods of time, as young people choose alternate paths based on what changes are likely on the horizon. Rapid changes frequently destabilize adults who are unwilling or unable to retrain, which then also sets back their next generation, who start off on an economic back foot with a worldview of despair and decrepitude.

      Not the AI companies' fault per se, but the US government generally does a very poor job of creating a safety net, whether through intent, ineptitude, or indifference.

      By the way, attacks were also leveled against Chinese and Japanese workers in California who were viewed as stealing jobs from other "Americans". So this viewpoint, and this tradition of behavior under capitalism, has a long history in the US.

    • franga2000 3 hours ago
      "The government" needs time to fix this and until then, we need to not automate everyone out of a job. If that means we don't "get to the future" until then, fine. "Fault" or not, the AI companies are putting people in danger now and unless we can implement a more proper solution extremely quickly, they just have to put up with being slowed down.

      But it's not like "the government" (as if there is just one) simply doesn't want to fix things. There are many people who want to fix the way we distribute resources, but there are others who are working to stop them. The various millionaires behind these AI companies are part of the reason why the problem you identified exists in the first place.

      • brap 3 hours ago
        So your government should pump the brakes, while other governments rush towards ASI. And you believe this will benefit you, long term? Or do you believe in “global cooperation”?
        • franga2000 2 hours ago
          This is exactly why I put "the government" in quotes. The parent post was saying "the government" should just fix the underlying problem instead of slowing AI progress. This has the same problem - no government can do that alone.

          So it's either "we all science ourselves out of a job and die from uncontrolled capitalism" or "we try to do something about the uncontrolled capitalism and in the meantime, everyone else keeps sciencing everyone out of a job". The result is the same, but some of us at least tried to do something about it.

          • vanviegen 10 minutes ago
            Not necessarily. You can also attempt to disconnect your economy from AI-driven economies. Yes, tariffs. :-)

            That path is hard and risky (are AI countries eclipsing us in military power?), but probably more realistic than hoping for global cooperation.

    • 6r17 1 hour ago
      The problem is that we do not share the same values at all, and I do not envision this truly benefiting people, nor is it something I envy or feel OK with. You can make the best fakes and do your best to remove activity from people, but ultimately this is going to lead to a deteriorated society, increased mental health issues, and plenty of sad stories, just so a few people can be happy about it.

      Not gonna lie, if someone nuked all of the USA's servers and wiped out all this bullshit, I'm not convinced the world would be in a worse state right now.

      Let AI be used for scientific research, development, and helping people out. But if it's just to set your smelly ideas down, you may even be right, but ultimately the form, the intentions, and the result matter more than recklessly endangering everybody.

      TBH, I feel like the AI discourse around human replacement smells like hard-core psychopathic behavior - or like that of a drunken dude happily driving a car.

      You have zero data concerning the effect this would have on society - and I would definitely prefer to live in a less technological world than in a world full of people with psychosis.

      So until we figure out how to solve this bottleneck, I have zero sympathy for this kind of discourse.

    • everforward 3 hours ago
      > This is what government is for, and not to stifle innovation by banning AI but by preparing society to move forward.

      This requires faith that the government will actually step in to do something, which many people lack (at least in the US, can't speak for globally).

      That's the sticking point for many of the people I've talked to about it. Some are diametrically opposed to AI, but most think there's a realistic chance AI takes jobs away, and an unrealistic chance that the government will oppose the whims of capital, leaving people displaced from their jobs to dip into poverty.

      I can't say I have a good counter-argument either. At least in the US, the government has largely sided with capital for my entire life. I wouldn't take a bet that government does the kind of wealth redistribution required if AI really takes off, and I would eat my hat if it happened in a timely manner, rather than only after an absolute crisis of ruined lives.

      See the accumulation of wealth at the top income brackets while the middle and lower classes get left behind.

      TLDR: this is more of a crisis of faith in the government than opposition to AI taking over crap jobs that people don't want anyway.

    • KoolKat23 2 hours ago
      And if we know we can't fix it fast enough, is a delay acceptable?
    • lukev 3 hours ago
      Agreed 100%, except that this is not a new development. This process has been ongoing, with automation taking over certain classes of jobs and doing them faster and better since the industrial revolution.

      And capitalism has flourished during this time. There's no reason to believe even more automation is going to change that, on its own.

      Sure, Musk and Altman can make noises and talk about the need for UBI "in the future" all they want, but their political actions clearly show which side they're actually on.

    • wizardforhire 3 hours ago
      Gonna just play a little mad libs here with your argument…

      Personal belief, but AI coming for your children is not a valid argument against AI. If AI can do a job better and/or faster, they should be the ones doing the parenting. Specialization is how we got to the future. So the problem isn't AI, it's the structure of how we humans rely on parenting for their children. I don't necessarily feel like it's the AI company's problem to fix either. This is what government is for, and not to stifle innovation by banning AI but by preparing society to move forward.

      You're right about one thing, within reason… this is what a rational government should be for… if the government were by the people and for the people.

      Addendum for emphasis: …and if that government followed the very laws it portends to protect and enforce…

      • marnett 2 hours ago
        The artist behind replacement.ai chose a very relevant first use case — everyone thinks of AI replacement in terms of labor, but the example in terms of parenting and child rearing, which is arguably the only true reason for humans to exist, is genius.

        Procreation and progeny are our only true purpose, and one could make the argument that AI would make better parents and teachers. Should we all surrender our sole purpose in the name of efficiency?

    • zer00eyz 2 hours ago
      > Personal belief, but robots coming for your jobs is not a valid argument against robots.

      Replace the word robot with "automation" or "industrialization" and you have the last 200 years of human history covered.

      The Luddites could have won, and we would all have $1,500 shirts.

      Do you know any lamp lighters? How about a town crier?

      We could still all be farming.

      Where are all the switchboard operators? Where are all the draftsmen?

      How many people had programming jobs in 1900? 1950?

      We have an amazing ability to "make work for ourselves", and history indicates that we're going to keep doing that regardless of how automated we make society. We also keep traditional "arts" alive... Recording didn't replace live performances, TV/film didn't replace Broadway... Photography didn't replace painting...

      • erichocean 1 hour ago
        > We have an amazing ability to "make work for ourselves"

        Humans will certainly be able to make leisure for themselves, but the idea they will be economically competitive performing work—any kind of work, pre-existing or to be created—is entirely the problem being discussed. AI employees will outperform any human at work, at lower cost, without any obvious downsides.

    • xpe 2 hours ago
      > This is what government is for, and not to stifle innovation

      We should compare how anti-government politicians talk versus how trained, educated neoclassical economists talk. The latter readily recognize that a valid function of government is to steer, shape, and yes, regulate markets to some degree. This is why we don’t have (for example) legal, open markets for murder.

      Markets do not define human values; they are a coordination mechanism given a diverse set of values.

    • romellem 3 hours ago
      “Guns don’t kill people, etc…”
    • ivape 3 hours ago
      > it's the structure of how we humans rely on jobs for income

      1. We don’t need everyone in society to be involved in trade.

      2. We made it so that if you do not take part in trade (trade labor for income), you cannot live.

      3. Thus, people will fear losing their ability to trade in society.

      The question is, when did we make this shift? It used to just be slavery, and you would be able to survive so long as you slaved.

      The fear is coming from something odd: the reality that you won't have to trade anymore to live. Our society has convinced us you won't have any value otherwise.

      • brap 2 hours ago
        >We made it so that if you do not take part in trade (trade labor for income), you cannot live.

        We did not make it so; this has been the natural state for as long as humans have existed, and in fact, it's been this way for every other life form on Earth.

        Maybe with post-scarcity (if it ever happens) there could be other ways of living. We can dream. But let’s not pretend that “life requires effort” is some sort of temporary unnatural abomination made by capitalists. It’s really just a fact.

        • ivape 2 hours ago
          You think every other life form on earth survives due to trading with each other? Most of human history has been some contingent contract between owner and labor, where most humans lived under some form of servitude or just pure slavery.

          Paradigm shift means, “I can live without being involved in a financial contract with another entity”. This is the milestone before us.

          • brap 2 hours ago
            It certainly survives by working, one way or another.

            My point is that until now, we have never been able to find a functioning system that frees us from work (trade is just one type of work, so is hunting for survival, or photosynthesis), and until something changes dramatically (like a machine that caters to our every need), I find it hard to believe this can change.

            • ivape 2 hours ago
              That's fine. I understand. I have a ridiculous belief that the universe is finally here to free us from work. AI is absurd, and magical, and if it does what we think it can, then the paradigm will shift. We, as custodians of the transition, have to work diligently to make sure this paradigm shift is not perverted by forces that want to re-enable the prior way (in which whatever can reasonably be had to live gets wrapped in a scalping structure where one side can extract additional value).

              One of the ways this shift will gain momentum is that children today are going to be born into the light. They will live decades without the concept of having to make decisions around the scarcity of work and resources. They will not have the same values and viewpoints on society that we, legacy components of the system, are currently engulfed by.

              Our generation will be the last to join all the prior generations in the boat of economic slavery, and it will be allowed to drift and sail away from port for the final time. It was a long journey; generations and generations were involved. Lots of thieving and abuse; good riddance.

    • malloryerik 3 hours ago
      Might want to read some Karl Polanyi.
    • wahnfrieden 3 hours ago
      It's also what organized labor is for. Workers can't wait on government to offer aid without leverage. We would not have weekends off or other protections we now take for granted if we waited on government to govern for us as if it was a caring parent.

      So that would mean it is in fact the responsibility of the people at robot/AI companies (and across industries). It's not something we can just delegate to role-based authorities to sort out on our behalf.

    • sharts 2 hours ago
      Replace robots with immigrants and it’s the same fear mongering as usual.
      • xpe 2 hours ago
        There is much more to concerns about AI than fear mongering. Reasonable people can disagree on predictions about probabilities of future events, but they should not discount reasonable arguments.
    • pesfandiar 3 hours ago
      It's wrong to assume the owners will share the productivity gains with everyone, especially when reliance on labour will be at its lowest, and the power structure of a data/AI economy is more concentrated than anything we've seen before. IMO, the assumption that some form of basic income or social welfare system will be funded voluntarily is as delusional as thinking communism would work.
    • exe34 3 hours ago
      > This is what government is for, and not to stifle innovation by banning AI but by preparing society to move forward.

      The AI will belong to the parasite class, who will capture all the profits - but you can't tax them on this, because they can afford to buy the government. So there isn't really a way to fund food and shelter for the population without taking something from the billionaires. Their plans for the future do not include us [0].

      [0] https://www.theguardian.com/news/2022/sep/04/super-rich-prep...

    • luxuryballs 1 hour ago
      “This is what government is for” was the most terrifying thing I’ve read all month. The only thing more starkly dystopian than relying on robots and AI for survival would be adding “the government” to the list.

      The government should keep to its charge as the protector and upholder of justice. I don't want it to be those things and then also become a fiat source of economic survival; that's a terribly destructive combination, because the government doesn't care about competition or viability, and survival is the last place you want to have all your eggs in one basket, especially when the eggs are guaranteed by force of law and the basket becomes a magnet for corruption.

    • jwilber 1 hour ago
      Ah yes, our government where career politicians from both sides have bent the rules to create 9-10 figure fortunes.
    • gnarlouse 3 hours ago
      Respectfully, that’s not a very functional belief. It’s sort of the equivalent to saying “communism is how mankind should operate”, while completely ignoring why communism doesn’t work: greedy, self-preserving genetic human instincts.

      The workforce gives regular folks at least some marginal stake in civilization. Governments aren't effective engines against AI. We failed to elect Andrew Yang in 2020, who was literally running on a platform of setting up a UBI tax on AI. Congress is completely corrupt and ineffectual. Trump is gutting the government.

      You may be right about AI taking jobs eventually, if that's what you're saying, but you come off pretty cold if you're implying it's what "should" happen because it's Darwinian and inevitable, and just sorta "well, fuck poor people."

      • NoOn3 2 hours ago
        Not only does Communism not work, Capitalism doesn't work without crises and wars either. :)
    • gtsop 3 hours ago
      > So the problem isn't robots, it's the structure of how we humans rely on jobs for income.

      It's called capitalism

    • sythnet 1 hour ago
      I agree with you
    • portaouflop 3 hours ago
      Specialization is for insects.

      A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly.

    • lwhi 3 hours ago
      The problem is more existential.

      Why are people even doing the jobs?

      In a huge number of cases people have jobs that largely amount to nothing other than accumulation of wealth for people higher up.

      I have a feeling that automation replacement will make this fact all the more apparent.

      When people realise big truths, revolutions occur.

  • allturtles 4 hours ago
    This is a brilliant piece of satire. "A Modest Proposal" for the AI age.

    The leader bios are particularly priceless. "While working for 12 years as the Director of HR for a multinational, Faith realized that firing people gave her an almost-spiritual high. Out of the office, Faith coaches a little league softball team and looks after her sick mother - obligations she looks forward to being free of!"

    • bilekas 1 hour ago
      > This is a brilliant piece of satire. "A Modest Proposal" for the AI age.

      There's some truth in all satire though. I'm just shocked YC hasn't nuked the link from the front page.

      • hn_throwaway_99 4 minutes ago
        > I'm just shocked YC hasn't nuked the link from the front page.

        I'm not. People dump on VCs and YC all the time here and it's frequently on the front page.

      • overfeed 49 minutes ago
        Everyone is way above average here on HN, and will be thinking "The article speaks about every other idiot I work with - my genius is singular; I'm too valuable to my employer and irreplaceable. I'll be the one wrangling the AI that replaces my colleagues, while they figure out welfare"
    • username223 2 hours ago
      "I have been assured by a very knowing American of my acquaintance in London, that a young healthy child well nursed, is, at a year old, a most delicious nourishing and wholesome food, whether stewed, roasted, baked, or boiled; and I make no doubt that it will equally serve in a fricassee, or a ragout."

      Would Sam Altman even understand the original, or would he just wander ignorantly into the kitchen and fling some salt at it (https://www.ft.com/content/b1804820-c74b-4d37-b112-1df882629...)? I'm not optimistic about our modern oligarchs.

      • oxag3n 2 hours ago
        Why did I read that FT article start to finish?

        Seems like a waste of time, but at the same time the feeling was similar to watching Hannibal Lecter in the kitchen scene.

  • sincerely 4 hours ago
    I kind of get it, but at the same time...isn't "we made a machine to do something that people used to do" basically the entire history of technology? It feels like somehow we should have figured out how to cope with the "but what about the old jobs" problem.
    • darthoctopus 4 hours ago
      that is the point of Luddism! the original Luddite movement was not ipso facto opposed to progress, but rather to the societal harm caused by society-scale economic obsolescence. the entire history of technology is also powerful business interests smearing this movement as being intrinsically anti-progress, rather than directly addressing these concerns…
      • Kiro 3 hours ago
        I think we should be careful attributing too much idealism to it. The Luddites were not a unified movement and people had much more urgent concerns than thinking about technological progress from a sociocentric perspective. Considering the time period with the Napoleonic Wars as backdrop I don't think anyone can blame them for simply being angry and wanting to smash the machines that made them lose their job.
        • serial_dev 1 hour ago
          And an important note: history is written by the victors. Additionally, just like how today some people have a caricatured understanding of the “other” side (whatever that might be), understanding what the Luddites' thoughts and motivations were through the lens of their victorious opponents will inevitably create a biased, harsh picture of them.
        • leptons 2 hours ago
          >wanting to smash the machines that made them lose their job.

          Wondering how long before people start setting datacenters on fire.

          • wongarsu 1 hour ago
            And how well those attempts fare. Data centers aren't exactly fortified, but they have a lot of focus on access control and redundancy, and usually have good fire suppression (except OVH apparently).

            Maybe ChatGPT has some ideas on how to best attack data centers /s

          • andriesm 2 hours ago
            I find it hard to locate my sympathy button for people who smash and burn things built up by other people.
            • bee_rider 1 hour ago
              The act of destruction is not inherently evil, it is a matter of what it targets. You can burn down the Library of Alexandria or you can bust open a concentration camp. (These are just some extreme examples, some datacenter isn’t morally equal to either).
            • saulpw 2 hours ago
              Datacenters aren't built by people, they're built by corporations.
              • einsteinx2 1 hour ago
                Corporations which are entirely made up of people. Not to mention the people that physically built and maintain the data center.

                Or did the actual legal fiction of a corporation do it? Maybe the articles of incorporation documents got up and did the work themselves?

                • saulpw 1 hour ago
                  It means that no one cares about the creations except in terms of money. If an Oracle building burns down and no one is hurt, I wouldn't shed a single tear. If an artistic graffiti mural adorned its wall, I would be more upset.
                  • einsteinx2 1 hour ago
                    I get what you mean, but my point is that even that Oracle building was designed, built, and maintained by the work of real people, many of whom, I assume, take pride in their work and may in fact care if it's burned down.
        • pydry 46 minutes ago
          Exactly, the Luddites weren't especially anti-technology. Smashing stocking frames was, for them, a tactic to drive up their wages.

          Just as the fallout of the Napoleonic Wars was used as a means of driving down their wages. The only difference is that that tactic didn't get employers executed.

          It's always been in the interest of capital to nudge the pitchforks away from their hides and toward the machines, and to try to recharacterize anti-capitalist movements as anti-technology.

          In 2010 I remember a particularly stupid example where Forbes declared anti-Uber protestors were "anti-smartphone".

          Sadly, most people don't seem to be smart enough not to fall for this.

      • orourke 4 hours ago
        I think the concern in this case is that, unlike before where machines were built for other people to use, we’re now building machines that may be able to use themselves.
        • Jordan-117 2 hours ago
          Not that much of a difference tbh. If one traditional machine allows one worker to do the work of twenty in half the time, that's still a big net loss in those jobs, even if it technically creates one.

          The real issue is that AI/robotics are machines that can theoretically replace any job -- at a certain point, there's nowhere for people to reskill to. The fact that it's been most disruptive in fields that have always been seen as immune to automation kind of underscores that point.

        • fragmede 3 hours ago
          The concern is the same, people want to be taken care of by society, even if they don't have a job, for whatever reason.
          • beeflet 3 hours ago
            In the old times, this was a "want" because the only people without work were those unqualified or unable to work. In the new times, it will be a "need" because everyone will be unemployed, and no one will be able to work competitively.
      • johnwheeler 3 hours ago
        There’s a difference between something and everything though
      • scotty79 4 hours ago
        Somehow modern Luddite messaging doesn't communicate that clearly either. Instead of "where's my fair share of AI benefits?" we hear "AI is evil, pls don't replace us".
        • freeone3000 4 hours ago
          Yes. The workers don't want to be replaced by machines. This is Luddism.
        • happytoexplain 4 hours ago
          >pls don't replace us

          Yeah, how dare they not want to lose their careers.

          Losing a bunch of jobs in a short period is terrible. Losing a bunch of careers in a short period is a catastrophe.

          Also, this is dishonest - nobody is confused about why people don't like AI replacing/reducing some jobs and forms of art, no matter what words they use to describe their feelings (or how you choose to paraphrase those words).

          • Filligree 4 hours ago
            That’s false. It’s very easy to become confused about the point, when anti-AI folks in general don’t spend their time attacking companies…

            What I typically see is:

            - Open source programmers attacking other open source programmers, for any of half a dozen reasons. They rarely sound entirely honest.

            - Artists attacking hobbyists who like to generate a couple pictures for memes, because it’s cool, or to illustrate stories. None of the hobbyists would have commissioned an artist for this purpose, even if AI didn’t exist.

            - Worries about potential human extinction. That’s the one category I sympathise with.

            Speaking for myself, I spent years discussing the potential economic drawbacks for when AI became useful. People generally ignored me.

            The moment it started happening, they instead started attacking me for having the temerity to use it myself.

            Meanwhile I’ve been instructed I need to start using AI at work. Unspoken: Or be fired. And, fair play: Our workload is only increasing, and I happen to know how to get value from the tools… because I spent years playing with them, since well before they had any.

            My colleagues who are anti-AI, I suspect, won’t do so well.

            • candiddevmike 3 hours ago
              I've seen enough anecdotes about business productivity lately to know that LLMs are not the solution to their workload struggles. You can't lay off people and expect the remainder + LLMs to replace them.
            • suddenlybananas 3 hours ago
              They'll replace you too you know
            • portaouflop 3 hours ago
              Human extinction is not a mere potential; it's just a matter of time. The conditions for human life on this planet have already been eroded enough that there is no turning back. The human race is sleepwalking into nothingness - it's fine, we had a good run and some great times in between.
          • serf 3 hours ago
            >Losing a bunch of jobs in a short period is terrible. Losing a bunch of careers in a short period is a catastrophe.

            'careers' is so ambiguous as to be useless as a metric.

            what kind of careers? scamming call centers? heavy petrochem production? drug smuggling? cigarette marketing?

            There are plenty of career paths that the world would be better off without, let's be clear about that.

            • beeflet 3 hours ago
              >what kind of careers? scamming call centers? heavy petrochem production? drug smuggling? cigarette marketing?

              All careers. All information work, and all physical work.

              Yes. It is better for someone to be a criminal than to be unemployed. They will at least have some minimal amount of leverage and power to destroy the system which creates them.

              A human soldier or drug dealer or something at least has the ability to consider whether what they are doing is wrong. A robot will be totally obedient and efficient at doing whatever job it's supposed to.

              I disagree totally. There are no career paths which would be better off automated. Even if you disagree with what the jobs do, automation would just make them more efficient.

          • scotty79 2 hours ago
            I would love to lose my job if I got 50% of the value it brings the corp that replaced me.
      • CamperBob2 3 hours ago
        Would we be better off today if the Luddites had prevailed?

        No?

        Well, what's different this time?

        Oh, wait, maybe they did prevail after all. I own my means of production, even though I'm by no means a powerful, filthy-rich capitalist or industrialist. So thanks, Ned -- I guess it all worked out for the best!

        • marnett 2 hours ago
            The Amish seem to be doing fine, and I don't know if their way of life is under as much existential risk of upheaval and change as everyone else's.
          • mock-possum 1 hour ago
              I'm sorry, but - who do you think, precisely, seems to be doing ‘fine’ among the Amish?

            White cishet men?

              I cannot imagine what a hell my life might have been if I were born into an Amish community, the abuse I would have suffered, the escape I would have had to make just to get to a point in my life where I could be me without fear of reprisal.

              God, just think about realizing that your choices are: die, conform, or make a complete exodus from your family and friends and everything you've ever known?

            “The Amish seem to be doing just fine” come on

    • _heimdall 4 hours ago
      If ML is limited to replacing some tasks that humans do, yes it will be much like any past technological innovation.

      If we build AGI, we don't have a past comparison for that. Technologies so far have always replaced a subset of what humans currently do, not everything at once.

      • yujzgzc 4 hours ago
        AGI does not replace "everything". It might replace most of the work that someone can do behind a desk, but there are a lot of jobs that involve going out there and working with reality outside of the computer.
        • theptip 4 hours ago
          AGI as defined these days is typically “can perform at competent human level on all knowledge work tasks” so somewhat tautologically it does threaten to substitute for all these jobs.

            It's a good thing to keep in mind that plumbers are a thing. My personal take is that if you automated all the knowledge work, then physical/robot automation would swiftly follow for the blue-collar jobs: robots are software-limited right now, and as Baumol's Cost Disease sets in, physical labor would become more expensive, so there would be increased incentive to solve the remaining hardware limitations.

          • yujzgzc 3 hours ago
            I don't think that robots are software-limited in all domains... And even if AI became a superhuman software dev, it wouldn't drive the cost of software development to zero.
            • theptip 2 hours ago
              Humanoid robotics (ie replacing the majority of workers) is highly software-limited right now.

              Here's a napkin-sketch proof: for many decades we have had hardware capable of dextrously automating specific tasks (e.g. car manufacture), but the limitation is the control loop; you have to hire a specialist to write g-code or whatever, and it's difficult to adapt to hardware variance (slop, wear, etc.), let alone adjust the task to new requirements.

              If you look at the current “robot butler” hardware startups they are working on: 1) making hardware affordable, 2) inventing the required software.

              Nothing in my post suggested costs go to zero. In the AGI scenario you assume software costs halve every N years, which means more software is written, and timelines for valuable projects get dramatically compressed.

              • anon7725 42 minutes ago
                Also, presumably if you have AGI you can have it address a physical problem at a higher level of abstraction. "Design a device to make any water heater installable by a single person in 20 minutes" would result in a complex system that would make a lot of blue collar labor redundant (the last water heater I had installed took 3 guys over an hour to complete).

                It would not even necessarily result in a human-like robot - just some device that can move the water heater around and assist with the process of disconnecting the old one and installing the new one.

        • qgin 3 hours ago
          There will most likely be period where robotics lags AGI, but how long will that really last?

          Especially with essentially unlimited AGI robotics engineers to work on the problem?

          • yujzgzc 2 hours ago
            There are already plenty of supply chain problems in the AI industry, but the supply chain limitations to robotics are even higher. You can't snap your fingers and increase robotics production tenfold or a hundredfold without a lot of supply chain improvements that will take a long time. I'd say anywhere between twenty and fifty years.
        • dom96 2 hours ago
          You don't think AGI will be able to figure out a way to give itself a physical form capable of doing all those jobs?
          • yujzgzc 1 hour ago
            I think about all the actual physical work that goes into building a functioning supply chain, and that it will take a lot more than "figuring out" to manifest such a physical form.
        • swarnie 4 hours ago
          "Everything" might be hyperbole but a huge percentage of the workforce in my country is office/desk based. Included in that % is a lot of the middleclass and stepping stone jobs to get out of manual work.

          If AI kills the middle and transitional roles i anticipate anarchy.

          • qgin 3 hours ago
            I haven’t heard a good argument why this isn’t the most likely path.
      • scotty79 4 hours ago
        I love SF, but somehow I don't find it a very good foundation for predicting the future. Especially when people focus on one very narrow theme of SF and claim with certainty that's what's gonna happen.
        • Flere-Imsaho 56 minutes ago
          > I love SF, but somehow I don't find it very good foundation for predicting the future.

          Nineteen Eighty-Four would like to have a word with you!

          Out of all SF, I would probably want to live in The Culture (Iain M. Banks). In these books, people basically focus on their interests, as all their needs are met by The Minds. The Minds (basically AIs) find humans infinitely fascinating - I assume because they were designed that way.

        • stevedonovan 3 hours ago
          Heh, I read SF as San Francisco; the point remains true. Except the Valley wants to force a future, not describe it.
        • Ray20 4 hours ago
          I mean, yes. The invention of AI that replaces virtually all workers would certainly pose a serious challenge to society. But that's nothing compared to what would happen if Jesus descended from the sky and turned off gravity for the entire planet.
    • merth 4 hours ago
      We invent machines to free ourselves from labour, yet we’ve built an economy where freedom from labour means losing your livelihood.
      • brainwad 1 hour ago
        Average hours worked have been more or less monotonically decreasing since the start of the industrial revolution, so in the long run we are slowly freeing ourselves. But in the short run, people keep working because a) machines usually are complementary to labour (there are still coal miners today, they are just way more productive) and b) even if some jobs are completely eliminated by machines (ice making, for example), that only "solves" that narrow field. The ice farmers could (and did) re-enter the labour market and find something else to do.
        • anon7725 35 minutes ago
          Are average hours worked decreasing because we have more abundance and less need to work, or are they decreasing because the distribution of work is changing?

          I find it hard to accept your claim because at the start of the industrial revolution there were far fewer women in the formal labor market than there are today.

          • brainwad 20 minutes ago
            Well there were also barely any men in the formal labour market. Most people were peasants working their family farm + sharecropping on estates of the landed gentry. But that doesn't mean they weren't working hard - both sexes worked well over 3000 hours per year, to barely scrape by.
      • fainpul 3 hours ago
        > We invent machines to free ourselves from labour

        That's a very romantic view.

        The development, production and use of machines to replace labour is driven by employers to produce more efficiently, to gain an edge and make more money.

        • brandensilva 51 minutes ago
          I would hope that people realize that money in itself is merely digits on a computer, and that the real power of this stuff belongs to the people, since the AI inherited and learned from us.

          I know that's a simplification, but we uphold this contract that controls us. The people get to decide how this plays out, and as much as I'm hopeful we move into a world that is more like Star Trek, that hope skips over the ugly transition that could succeed or fail to get us there.

          But we aren't that far off from a replicator if our AI models become so advanced that, in a world of atomic-scale compute, they can rearrange atoms into new forms. It seemed like fiction before, but it is within reach of humanity, should we not destroy ourselves.

          • anon7725 29 minutes ago
            Our moral and political development severely lags our technological development. I have very little confidence that it will ever catch up. Looking back over the post-WW2 era, we have seen improvements (civil rights, recognition of past injustices, expansion of medical care in many countries) but also serious systemic regressions (failure to take climate change seriously, retreat to parochial revenge-based politics, failure to adequately fund society's needs, capture of politics and law by elites).

            My main concern about AI is not any kind of extinction scenario but just the basic fact that we are not prepared to address the likely externalities that result from it because we're just historically terrible at addressing externalities.

      • beeflet 3 hours ago
        No other such economy has ever existed. "He who does not work, neither shall he eat"
      • Ray20 4 hours ago
        Because we invent machines not to free ourselves from labor (inventing machines is a huge amount of labor by itself), but to overcome the greed of the workers.
      • Tepix 3 hours ago
        „We“? A few billionaires do. They won‘t free themselves from labour, they will „free“ you from it. Involuntarily.
    • theptip 4 hours ago
      > AI can do anything a human can do - but better, faster and much, much cheaper.

      Should be pretty clear that this is a different proposition to the historical trend of 2% GDP growth.

      Mass unemployment is pretty hard for society to cope with, and understandably causes a lot of angst.

      • brandensilva 46 minutes ago
        And that comes down to the moral and social contract we have and the power we give to digital money and who owns it.

        We either let the people's creativity and knowledge be controlled and owned by a select few, OR we ensure all people benefit from humanity's creativity and own it, so that the fruits it bears advance all of humanity, with safety nets in place to ensure we are not enslaved by it but elevated to advance it.

    • happytoexplain 4 hours ago
      >we should have figured out

      You would think! But it's not the type of problem Americans seem to care about. If we could address it collectively, then we wouldn't have these talking-past-each-other clashes where the harmed masses get told they're somehow idiots for caring more about keeping the life and relative happiness they worked to earn for their families than achieving the maximum adoption rate of some new thing that's good for society long term, but only really helps the executives short term. There's a line where disruption becomes misery, and most people in the clear don't appreciate how near the line is to the status quo.

    • aabhay 4 hours ago
      History is full of technology doing things that go beyond human possibility as well. Think of microscopes, guns, space shuttles. There has been technology that explicitly replaces human labor but that is not at all the whole story.
    • FloorEgg 3 hours ago
      Every time it happens it's a bit different, and it was a different generation. We will figure it out. It will be fine in the end, even if things aren't fine along the way.

      I'm starting to come around to the idea that electricity was the most fundamental force that drove WW1 and WW2. We point to many other, more political, social, and economic reasons, but whenever I do a kind of 5-whys on those reasons I keep coming back to electricity.

      AI is kind of like electricity.

      We're also at the end of a big economic/money cycle (petrodollar, gold standard, off the gold standard, maxing out leverage).

      The other side will probably involve a new foundation for money. It might involve blockchain, but maybe not, I have no idea.

      We don't need post-scarcity so much as we just need to rebalance everything and an upgraded system that maintains that balance for another cycle. I don't know what that system is or needs, but I suspect it will become more clear over the next 10-20 years. While many things will reach abundance (many already have) some won't, and we will need some way to deal with that. Ignoring it won't help.

    • AviationAtom 4 hours ago
      I always compare it to the age of the industrial revolution. I have no doubt you had stubborn old people saying: "Why would I need a machine to do what I can do just fine by hand??" Those people quickly found themselves at a disadvantage to those who chose not to fight change, but to embrace it and harness technological leaps to improve their productivity and output.
      • happytoexplain 4 hours ago
        Most people are not in a position to choose whether to embrace or reject. An individual is generally in a position to be harmed by or helped by the new thing, based on their role and the time they are alive.

        Analogies are almost always an excuse to oversimplify. Just defend the thing on its own properties - not the properties of a conceptually similar thing that happened in the past.

      • beeflet 3 hours ago
        The difference is that in the industrial revolution there was a migration from hard physical labor to cushy information work.

        Now that information work is being automated, there will be nothing left!

        This "embrace or die" strategy obviously doesn't work on a societal scale, it is an individual strategy.

        • anon7725 20 minutes ago
          > in the industrial revolution there was a migration from hard physical labor to cushy information work.

          The industrial revolution started in the early 1800s. It was a migration from hard physical labor outdoors, around the home, and in small workshops to hard physical labor in factories.

        • brainwad 1 hour ago
          Most people are not doing "information" work. They provide interpersonal services, such as health/aged/childcare or retail/hospitality/leisure.

          Techies are angsty because they are the small minority who will be disrupted. But let's not pretend most of the economy is even amenable to this technology.

          • anon7725 17 minutes ago
            > Techies are angsty because they are the small minority who will be disrupted. But let's not pretend most of the economy is even amenable to this technology.

            Think of all the jobs that do not involve putting your hands on something that is crucial to the delivery of a service (a keyboard, phone, money, etc. does not count). All of those jobs are amenable to this technology. It is probably at least 30% of the economy on a first pass, if not more.

    • itsnowandnever 4 hours ago
      > isn't "we made a machine to do something that people used to do" basically the entire history of of technology?

      Kinda, I guess. But what has everyone on edge these days is that humans always used technology to build things: to build civilization and infrastructure so that life was progressing in some way. At least in the US, people stopped building and advancing civilization decades ago. Most sewage and transportation infrastructure is from 70+ years ago. Decades ago, telecom infrastructure boomed for a bit, then abruptly halted. So the "joke" is that technology these days is in no way "for the benefit of all" like it typically was for all human history (with obvious exceptions).

    • overgard 4 hours ago
      "we made a machine to do everything so nobody does anything" is a lot different though
    • Keyframe 4 hours ago
      isn't "we made a machine to do something that people used to do" basically the entire history of of technology?

      Yes, until we reached the art and thinking part. A big part of the problem might be that, with AI, we reached that part before the chores.

    • collinmanderson 2 hours ago
      > isn't "we made a machine to do something that people used to do" basically the entire history of of technology?

      I know, right? Machines have been gradually replacing humans for centuries. Will we actually get to the point where there are not enough jobs left? It doesn't seem like we're currently anywhere close to the point of not having any jobs available.

      Has anyone thought about how the Federal Reserve plays a role with this? Automation puts downward pressure on inflation, because it doesn't cost as much to make stuff. The Federal Reserve will heavily incentivize job creation if inflation is low enough and there aren't enough jobs available, right?

      • beeflet 2 hours ago
        We're already here. Most jobs are fake.
        • collinmanderson 2 hours ago
          It seems like the _quality_ of the jobs (or median job) may have gone down, but the _quantity_ of jobs relative to population has remained roughly steady, right?
          • beeflet 1 hour ago
            I don't mean the quality is bad, it's just that most jobs in the first world seem to be redundant or abstracted from the keys of power.

            I think David Graeber wrote a book about it. Here is a guy talking about it:

            https://www.youtube.com/watch?v=9lDTdLQnSQo

    • Ray20 4 hours ago
      Hasn't every such technological development been accompanied by opponents of its implementation?

      At least now, things aren't so bad, and today's Luddites aren't trashing the offices of AI companies and hanging their employees and executives from nearby poles and trees.

      • no_wizard 4 hours ago
        The vast majority of the movement was peaceful. There is one verified instance where a mill owner was killed and it was condemned by leaders of the movement. It was not a violent movement at its core.

        Second, the movement was certainly attacked first. It was mill owners who petitioned the government to use "all force necessary" against the Luddites, and the government, acting on their behalf, killed and maimed people who engaged in peaceful demonstrations before anyone associated with the Luddite movement reacted violently. And again, even in the face of violence, the Luddite movement was at its core nonviolent.

      • blibble 4 hours ago
        they haven't started... yet

        billions of unemployed people aren't going to just sit in poverty and watch as Sam Altman and Elon become multi-trillionaires

        (why do you think they are building the bunkers?)

      • only-one1701 4 hours ago
        Can you imagine? Ha ha. Wow that would be crazy. Damn. I’m imagining it right now! Honestly it’s hard to stop imagining.
    • qgin 3 hours ago
      We’re working on all-purpose human replacements.

      Imagine if the tractor made most farm workers unnecessary but when they flocked to the cities to do factory work, the tractor was already sitting there on the assembly line doing that job too.

      I don’t doubt we can come up with new jobs, but the list of jobs AGI and robotics will never be able to do is really limited to ones where the value intrinsically comes from the person doing it being a human. It’s a short list tbh.

    • zb3 4 hours ago
      Yes, and thanks to this we're working more and more, because most of the profit goes to the top as inequality rises. At some point it will not be possible to put up with this.
    • newsclues 4 hours ago
      Replacing dirty, dangerous jobs, and allowing people to upskill and work better jobs is one thing.

      Firing educated workers en masse for software that isn't as good but is cheaper doesn't have the same benefits to society at large.

      What is the goal of replacing humans with robots? More money for the ownership class, or freeing workers from terrible jobs so they can contribute to society in a greater way?

      • Ray20 3 hours ago
        > doesn’t have the same benefits to society at large.

        The benefits to society will be larger. Just think about it: when you replace a dirty, dangerous job, the workers simply have nowhere to go, and they begin to generate losses for society in one form or another. Because initially, they took that dirty, dangerous job because they had no choice.

        But when you fire educated workers en masse, society not only receives from software all the benefits that it received from those workers, but all other fields also start to develop, because these educated workers take on other jobs, jobs that have never been filled by educated workers before. Jobs that are understaffed because they are too dirty or too dangerous.

        This will be a huge boost even for areas not directly affected by AI.

        • Supernaut 2 hours ago
          > these educated workers are taking on other jobs [...] that are understaffed because they are too dirty or too dangerous.

          Just so we're clear here, are you personally going to be happy when you're forced to leave your desk to eke out a living doing something dirty and/or dangerous?

          • Ray20 2 hours ago
            Of course not. But I'm also pretty unhappy that Supernaut doesn't send me a third of their salary. But what does this have to do with the question?
        • newsclues 2 hours ago
          I don’t think you are considering the negative consequences.

          When you fire massive numbers of educated workers to replace them with AI, you make a mess of the economy and all those workers are in a worse situation.

          Farming got more productive and farmers became factory workers, and then factory workers became office workers.

          The people replaced by AI don’t have a similar path.

          • Ray20 2 hours ago
            > make a mess of the economy

            You're not taking something into account. The economy is becoming stronger, more productive, and more efficient because of this. The brain drain from all other fields to the few highest-paying ones is decreasing.

            > The people replaced by AI don’t have a similar path.

            They have a better path: get a real job that will bring real benefit to society. Stop being parasites and start doing productive work. Yes, goods and services don't fall from the sky, and to create them, you have to get your hands dirty.

            • ryandrake 1 hour ago
              > to create them, you have to get your hands dirty.

              But we're talking about a world where they're building robots to do this kind of work. When AI takes over the white collar office jobs, and robotic automation takes the manual "creating" labor, what'll be left for humans to do?

              • Ray20 14 minutes ago
                > what'll be left for humans to do?

                There is an infinite amount of labor.

    • brandensilva 1 hour ago
      The problem isn't the failure of the mathematicians and engineers who succeeded at the task of automating humanity's mundane tasks in life.

      It's that the people failed to elect and wield a government that ensures all humanity benefits from it and not a select few who control it all.

      And I think it will become clear that the governments that invest in it so their people benefit and have ownership, versus the ones that invest in it to benefit just a handful of the rich, are the ones who will keep society stable while this happens.

      The other path we are going down is one of mass unrest, a move into a police state to control the resistance like America is doing now, and exactly what Peter Thiel, Elon Musk, and Larry Ellison want: AI-driven surveillance and an Orwellian dystopian vision forcing people to comply or be cut out of existence by deactivating their digital IDs.

    • hsavit1 4 hours ago
      I feel like technology should exist to enhance the human experience, not eliminate the human experience?
    • poszlem 4 hours ago
      Not really, because this time it's not a machine to do something that people used to do, but a machine to do anything and everything that people used to do.
      • bamboozled 2 hours ago
        Enjoy eating a bowl of pasta?
    • intended 1 hour ago
      The idea is that there will be newer jobs that come up.

      The issue is that there will be no one earning money except the owners of OpenAI.

      Take outsourcing - the issue in developed nations was underemployment and the hollowing out of industrial centers. You went from factory foreman to burger flipper. However, it did uplift millions out of poverty in other nations. So net-net, we employed far more and distributed wealth.

      With automation, we simply employ fewer people, and the benefits accrue to smaller groups.

      And above all - these tools were built essentially by mass plagiarism. They train, even now, on the random stuff we write on HN and Reddit.

      TLDR: it's not the automation, it's the wealth concentration.

    • shortrounddev2 4 hours ago
      Manual labor was replaced with factory labor, factory labor with knowledge work. If knowledge work is replaced with AI, what do we go to then? Not to mention that the efficiency gains of the modern tech industry are not even remotely distributed fairly. The logical extreme of an AI company would be one where the CEO, founder, 100% owner, and sole employee coordinates some underling AIs to run the entire company for him while he collects the entire profit and shares it with no one, because the American government is an oligarchy.
    • zzzeek 4 hours ago
      > I kind of get it, but at the same time...isn't "we made a machine to do something that people used to do" basically the entire history of technology?

      This is not about machines. Machines are built for a purpose. Who is "building" them, and for what "purpose"?

      If you look at every actual real-world human referenced on this website, they all have something in common: they're billionaires.

      This is a website about billionaires and their personal agendas.

  • nharada 1 hour ago
    How come I never see any concrete proposals for how to equitably distribute the wealth of AI? It's always either "stop AI immediately for the sake of our labor" or "don't worry sometime in the future everyone will live in utopia probably".

    Here's a starter example: any company whose main business is training AI models must give up 10% of the company to a fund whose charter is to establish long-term basic care (food, water, electricity, whatever) for citizens.

    I'm sure people will come at me with "well this will incentivize X instead!" in which case I'd like to hear if there are better thought out proposals.

    • BrenBarn 10 minutes ago
      The question is what is different about equitably distributing the wealth of AI vs. equitably distributing wealth in general. It seems that the main difference is that, with AI wealth specifically, there is a lot of it being generated right now at a breakneck pace (although its long-term stability is in question). Given that, I don't think it's unreasonable to propose "stop AI immediately while we figure out how to distribute wealth".

      The problem is that the longer you refrain from equitably distributing wealth, the harder it becomes to do it, because the people who have benefited from their inequitably distributed wealth will use it to oppose any more equitable distribution.

    • perlgeek 10 minutes ago
      In theory, we know how to do wealth redistribution, AI or no AI: tax value creation and wealth transfer, such as inheritance. Then use the money to support the poor, or even everyone.

      The problem really is political systems. In most developed countries, wealth inequality has been steadily increasing, even though if you ask people if they want larger or smaller inequality, most prefer smaller. So the political systems aren't achieving what the majority wants.

      It also seems to me that most elections are won on current political topics (the latest war, the latest scandal, the current state of the economy), not on long-term values such as decreasing wealth inequality.

    • idreyn 31 minutes ago
      This sounds a lot like a sovereign wealth fund. The government obtains fractional ownership over large enterprises (this can happen through market mechanisms or populist strongarming — choose your own adventure) and pours the profits on these investments into the social safety net or even citizens' dividends.

      For this to work at scale domestically, the fund would need to be a double-digit percentage of the market cap of the entire US economy. It would be a pretty drastic departure from the way we do things now. There would be downsides: market distortions and fraud and capital flight.

      But in my mind it would be a solution to the problem of wealth pooling up in the AI economy, and probably also a balm for the "pyramid scheme" aspect of Social Security which captures economic growth through payroll taxes (more people making more money, year on year) in a century where we expect the national population to peak and decline.

      Pick your poison, I guess, but I want to see more discussion of this idea in the Overton window.
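
      A rough back-of-envelope sketch of what such a fund might pay out (every number below is an illustrative assumption, not a figure from any actual proposal):

          # Illustrative only: assumed round numbers, not real policy figures.
          us_equity_market_cap = 60e12   # very rough total US equity market cap, USD
          fund_share = 0.10              # low end of a "double-digit percentage" stake
          payout_rate = 0.04             # assumed sustainable annual payout
          population = 335e6             # approximate US population

          fund_size = us_equity_market_cap * fund_share
          per_person = fund_size * payout_rate / population
          print(f"${fund_size/1e12:.0f}T fund, about ${per_person:,.0f} per person per year")

      On those assumptions you get a roughly $6T fund paying on the order of $700 per person per year - a meaningful supplement, but nowhere near a full safety net, which is why the stake would likely need to be much larger.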

      • starik36 5 minutes ago
        > The government obtains fractional ownership over large enterprises (this can happen through market mechanisms or populist strongarming...)

        Isn't that what happened in the Soviet Union? Except it wasn't fractional. It ushered in 50 years of misery.

    • ben_w 1 hour ago
      > How come I never see any concrete proposals for how to equitably distribute the wealth of AI?

      Probably because most politics about how to "equitably distribute the wealth" of anything is one or both of "badly thought out" and "too complex to read".

      For example of the former, I could easily say "have the government own the AI", which is great if you expect a government that owns AI to continue to care if their policies are supported by anyone living under them, not so much if you consider that a fully automated police force is able to stamp out any dissent etc.

      For example of the latter, see all efforts to align any non-trivial AI to anything, literally even one thing, without someone messing up the reward function.

      For your example of 10%, well, there's a dichotomy on how broad the AI is, if it's more like (it's not really boolean) a special-purpose system or if it's fully-general over all that any human can do:

      • Special-purpose: that works but also you don't need it because it's just an assistant AI and "expands the pie" rather than displacing workers entirely.

      • Fully-general: the AI company can relocate offshore, or off planet, do whatever it wants and raise a middle finger at you. It's got all the power and you don't.

    • otterley 1 hour ago
      This is what taxation and wealth redistribution schemes are for. The problem is that Americans generally find this idea to be abhorrent, even though it would probably benefit most of the people who are against the principle. They don’t want a dime to go to people they feel are undeserving of it (“lazy” people, which is typically coded language to mean minorities and immigrants).
    • atleastoptimal 1 hour ago
      The problem is there are many people who think AI is a big scam and has no chance of long-term profitability, so a fund would be a non-starter, or people who think AI will be so powerful that any paltry sums would pale in comparison to ASI's full dominance of the lightcone, leaving human habitability a mere afterthought.

      There honestly aren't a lot of people in the middle, amazingly, and most of them work at AI companies anyway. Maybe there's something about our algorithmically manipulated psyches in the modern age that draws people towards more absolutist all-or-nothing views, incapable of practical nuance when in the face of a potentially grave threat.

    • ArcHound 33 minutes ago
      Why would the AI owners want to distribute wealth equitably? They want to get rich.

      What government in the foreseeable future would go after them? This would tank the US economy massively, so not US. The EU will try and regulate, but won't have enough teeth. Are we counting on China as the paragon of welfare for citizens?

      I propose we let the economy crash, touch some grass and try again. Source: I am not an economist.

  • andai 3 hours ago
    https://replacement.ai/complaints

    At the bottom of this page, there is a form you can fill out. This website says they will contact your local representative on your behalf. (And forward you any reply.)

    Here's the auto-generated message:

    I am a constituent living in [state] with urgent concerns about the lack of guardrails surrounding advanced AI technologies. It is imperative that we act decisively to establish strong protections that safeguard families, communities, and our children from potential harms associated with these rapidly evolving systems.

    As companies continue to release increasingly powerful AI systems without meaningful oversight, we cannot rely on them to police themselves, especially when the stakes are so high. While AI has the potential to do remarkable things, it also poses significant risks, including the manipulation of children, the development of bioweapons, the creation of deepfakes, and the threat of widespread unemployment.

    I urge you to enact strong federal guardrails for advanced AI that protect families, communities, and children. Additionally, please do not preempt or block states from adopting strong AI protections that may be necessary for their residents.

    Thank you for your time.

    [name]

    New York

    • riazrizvi 3 hours ago
      Oh, too bad. I initially shared it on LinkedIn but deleted it once I saw this. I'm all for establishing in the mind of the commons that displacing humans from the economy is inane, and for open dialogue on the subject. I'm just not up for some little team trying to control things.
  • layer8 2 hours ago
    At first I thought the Sam Altman quote was a joke, but it's actually real: https://archive.ph/gwdZ0
    • arisAlexis 1 hour ago
      It's real, and people were confused and didn't get it either. The authors of the website are some of them.
      • layer8 41 minutes ago
        You could illuminate us and elucidate the correct understanding.
  • darepublic 24 minutes ago
    Brilliant satire but all too true. Just clipping snippets from openai and misanthropic provided some of the most hilarious moments
  • kwar13 2 hours ago
    Agriculture used to be more than 50% of ALL employment in the USA. Now it's barely 2%. Ever wondered why?

    I understand the spirit of this, but most of this alarmism is misguided in my view.

    • KaiserPro 33 minutes ago
      > Now it's barely 2%. Ever wondered why?

      Mechanisation

      But did you ever wonder what happened to the displaced workers? I'm not an expert on the agricultural changes in the USA, but in the UK a huge amount of tumult can be directly attributed to agricultural changes.

      • kwar13 19 minutes ago
        The displaced workers went on to do other things. In the last 100 years, for those very workers who "lost" their jobs to mechanization, the overall standard of life for ALL Americans, no matter how you measure it, is better. There are fewer people living in poverty, people live longer lives, there is better infrastructure, and no widespread famine in the US that I am aware of.
    • ben_w 49 minutes ago
      There were about 20 million horses in March 1915 in the United States.

      Then you stopped needing them: a USDA census in 1959 showed the horse population had dropped to 4.5 million.

      Now they're mostly used for riding, and in 2023, there were about 6.65 million horses.

      (Citation: https://en.wikipedia.org/wiki/Horses_in_the_United_States#St...)

      There's no law of nature that says "there's always a place for more horses", and anyone who suggested there might be would get laughed at. Well, there's also no law of nature that says "there's always a place for more humans", to butcher a line from CGP Grey a little over a decade ago.

    • brainwad 1 hour ago
      75% even: https://www.researchgate.net/figure/Percent-of-the-Labor-For.... And higher figures were seen in ancient civilisations.
    • timeon 2 hours ago
      Americans used to be not fat. Ever wondered why?
      • para_parolu 2 hours ago
        No cheap sugar?
        • abeppu 1 hour ago
          ... but also increasingly sedentary lives, in part related to the kinds of things we do all day?
          • cloverich 1 hour ago
            It took me a long time to come around to simplifying it all to sugar. It's not the only reason, of course, but my mind now thinks: there are other countries like ours, except not everyone is fat. Sugar bypasses our normal satiety - you can put dessert and sugary drinks on top of a regular diet more easily than you can a second or third serving of the entree. Sugar is cheap and an easy way to make money in the food business. And we have sugar factories galore selling candy disguised as coffee. Add one coke a day to a balanced diet, and you add 5-10lb of fat to a person per year.

            My bias is now simply: it's the sugar. Not only the sugar, but far and away the number one culprit.
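
            A quick back-of-envelope check of that last number (assuming roughly 140 kcal per 12 oz can and the common ~3,500 kcal-per-pound rule of thumb, both approximations):

                # Rough upper bound on fat gain from one extra can of coke per day.
                kcal_per_can = 140      # approximate calories in a 12 oz can
                kcal_per_lb_fat = 3500  # common rule-of-thumb conversion
                days = 365

                upper_bound_lb = kcal_per_can * days / kcal_per_lb_fat
                print(round(upper_bound_lb, 1))  # ~14.6 lb theoretical ceiling

            Metabolic compensation keeps real-world gains well below that ceiling, which is roughly where the 5-10lb figure sits.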

          • vntok 1 hour ago
            No, most countries have followed this trend as well and don't have nearly the same obesity prevalence increase.

            It's preprocessed food and sugar intake in general that are particularly bad in the US.

  • mobileturdfctry 20 minutes ago
    Henry Ford's philosophy was that if he paid his workers a higher wage, they would be able to afford the products they were producing, namely his Model T automobiles. This would, in turn, create a larger customer base for his company and help stimulate the economy by increasing consumer spending.
  • atleastoptimal 57 minutes ago
    A problem with anti-AI discourse is that there are three separate groups who rarely communicate, and if they do they just talk past each other:

    1. Rationalists/EA's who moderately-strongly believe AI scaling will lead to ASI in the near future and the end of all life (Yud, Scott Alexander)

    2. Populist climate-alarmists who hate AI for a combination of water use, copyright infringement, and decimation of the human creative spirit

    3. Tech "nothingburgerists" who are convinced that most if not all AI companies are big scams that will fail, LLM's are light-years from "true' intelligence and that it's all a big slop hype cycle that will crumble within months to years. (overrepresented on this site)

    Each group has a collection of "truthiness-anchors" that they use to defend their position against all criticism. They are all partially valid, in a way, but take their positions to the extreme to the point they are often unwilling to accept any nuance. As a result, conversations often lead nowhere because people defend their position to a quasi-religious degree rather than as a viewpoint predicated on pieces of evidence that may shift or be disproven over time.

    Regarding the satire in OP, many people will see it as just a funny, unlikely outcome of AI, others will see it as a sobering vision into a very likely future. Both sides may "get" the point, but will fail to agree at least in public, lest they risk losing a sort of status in their alignment with their sanity-preserving viewpoint.

  • lotfi-mahiddine 2 minutes ago
    I agree with this point,
  • chausen 4 hours ago
    The CEO spending his time “Practicing expressions” cracked me up.
  • Topfi 4 hours ago
    Reminds me of an ARG [0] I made in the early days of LLM hype. I honestly had three emails asking whether they could invest. Likely scams if we are honest, just some automated crawlers, but I found it funny nonetheless.

    [0] https://ethical-ai.eu

  • overgard 4 hours ago
    I think this is clever satire, but slightly blunted by the fact that the tech CEOs haven't been that far off from actually saying this stuff.

    I'm especially disgusted with Sam Altman and Dario Amodei, who for a long time were hyping up the "fear" they felt for their own creations. Of course, they weren't doing this to slow down or approach things in a more responsible way; they were talking like that because they knew creating fear would bring in more investment and more publicity. Even when they called for "regulation", it was generally misleading and mostly to help them create a barrier to entry in the industry.

    I think now that the consensus among the experts is that AGI is probably a while off (like a decade), we have a new danger. When we do start to get systems we should actually worry about, we're going to have a major boy-who-cried-wolf problem. It's going to be hard to get these things under proper control when people start to have the feeling of "yeah, we heard this all before".

  • lateforwork 3 hours ago
    You don't need money. What you need is wealth. I am going to leave it to PG to explain the difference [1]: Wealth is not money. Wealth is stuff we want: food, clothes, houses, cars, gadgets, travel to interesting places, and so on. You can have wealth without having money. If you had a magic machine that could on command make you a car or cook you dinner or do your laundry, or do anything else you wanted, you wouldn't need money. Whereas if you were in the middle of Antarctica, where there is nothing to buy, it wouldn't matter how much money you had.

    AI & robots will generate wealth at unprecedented scale. In the future, you won't have a job nor have any money, but you will be fabulously wealthy!

    [1] http://www.paulgraham.com/wealth.html

    • steve_adams_86 3 hours ago
      My main concerns:

      1. When such wealth is possible through autonomous means, how can the earth survive such unprecedented demands on its natural resources?

      2. Should I believe that someone with more wealth (and as such, more power) than I have would not use that power to overwhelm me? Isn't my demand on resources only going to get in their way? Why would they allow me to draw on resources as well?

      3. It seems like the answer to both of these concerns lies in government, but no government I'm aware of has really begun to answer these questions. Worse yet, what if governments disagree on how to implement these strategies in a global economy? Competition could become an intractable drain on the earth and humans' resources. Essentially, it opens up the possibility of war at incalculable scales.

      • lateforwork 3 hours ago
        > someone with more wealth

        Well in trekonomics [1], citizens are equal in terms of material wealth because scarcity has been eliminated. Wealth, in the conventional sense, does not exist; instead, the "wealth" that matters is human capital—skills, abilities, reputation, and status. The reward in this society comes not from accumulation of material goods but from intangible rewards such as honor, glory, intellectual achievement, and social esteem.

        [1] https://en.wikipedia.org/wiki/Trekonomics

        • beeflet 3 hours ago
          What use are human skill, abilities, reputation, and status when human labor has been totally outmoded by machines?

          Trekonomics seems like a totally backwards way of approaching post-scarcity by starting with a fictional setting. You might as well prepare yourself for the Star Wars economy.

          • lateforwork 2 hours ago
            Humans have an innate desire to be loved and respected, and this won’t change even if jobs cease to exist.
            • beeflet 2 hours ago
              All life has a desire to exist, but that doesn't prevent extinction.
        • steve_adams_86 3 hours ago
          That sounds great. Presently we still experience scarcity due to the limitations of earth's resources, though.
      • dzjkb 2 hours ago
        These concerns are already perfectly applicable to the current state of the world, no? More planetary boundaries keep getting crossed due to unnecessarily large resource usage, the poorest people don't have access (even though they could) to the resources necessary to survive...
      • dtauzell 2 hours ago
        The rich will be the people who control the finite resources. If you have land that can be mined you will be rich if you can protect it and sell the thing that is actually limited.
      • overvale 2 hours ago
        I’m not so sure access to scarce resources = wealth. Wealth can be an abundance of valuable things that are not scarce.
      • metabagel 2 hours ago
        Plus, many billionaires fund campaigns to spread the ideology of a cripplingly limited government.
    • defgeneric 1 hour ago
      It's garbage opinions like this that make PG so tiring. The superficial air of reasonableness makes it attractive to younger SF tech people who haven't experienced the context out of which these arguments arose and have no idea who he's plagiarizing/channeling. (For starters, the distinction between wealth and money/capital goes back at least to the 17th century.) For those who are more interested in being the "next unicorn" than engaging seriously with ideas, his little "essays" serve as a kind of armor--we don't have to think about that problem because PG wrote about it!
    • throwaway0123_5 2 hours ago
      Let's say I have a robot or two with a genius-level intellect. In theory it could manufacture a car for me or cook my dinner or harvest the crops needed for my dinner. But I don't own the mine where the metals needed to make a car come from. I don't own a farm where the food for my dinner comes from. Unless the distribution of resources changes significantly, it doesn't really help me that I have a genius robot. It needs actual physical resources to create wealth.

      Right now the people that own those resources also depend on human labor to create wealth for them. You can't go from owning a mine and a farm to having a mega-yacht without people. You have to give at least some wealth to them to get your wealth. But if suddenly you can go from zero to yacht without people, because you're rich enough to have early access to lots of robots and advanced AI, and you still own the mine/farm, you don't need to pay people anymore.

      Now you don't need to share resources at all. Human labor no longer has any leverage. To the extent most people get to benefit from the "magic machine," it seems to me like it depends almost entirely on the benevolence of the already wealthy. And it isn't zero cost for them to provide resources to everyone else either. Mining materials to give everyone a robot and a car means fewer yachts/spaceships/mansions/moon-bases for them.

      Tldr: I don't think we get wealth automatically because of advanced AI/robotics. Social/economic systems also need to change.

      • lateforwork 2 hours ago
        Great points. We can only get to 100% trekonomics if people don't hoard raw materials.
    • mfro 3 hours ago
      Maybe it’s time we all started taking Trekonomics seriously.
    • KaiserPro 21 minutes ago
      The world would be a lot better if people looked at near history more often for the political movements that failed:

      https://en.wikipedia.org/wiki/Social_credit

      (It's where the excess profits from mechanisation would be fed back to the citizens so that they don't need to work as much. That failed spectacularly.)

      PG's argument is a huge amount of words to miss the point. Money is a tool that reflects power. Wealth derives from power.

      > AI & robots will generate wealth at unprecedented scale. In the future, you won't have a job nor have any money,

      I would gently tell you that you might want to look at the living conditions of the working class in the early 20th century. You might see planned cities like Bournville or whatever the American version is; those were the 1% of the working classes. The average housing was shit, horrid shit. If AI takes off and makes, say, 10% of the population jobless, that's what those people will get: shit.

      It wasn't until those dirty socialists got into power in the UK (I don't know about other countries) that we started to see things like slum clearances, where the dispossessed were actually re-homed rather than yeeted somewhere less valuable.

    • achierius 2 hours ago
      We already have systems that create unprecedented wealth: industrial economies. And yet we let billions live and die without proper healthcare, nutrition, housing, and so on. The promise you make -- that with increases in productive capacity will clearly come an end to want -- has already been shown to be a lie. Please stop peddling it.
      • lateforwork 1 hour ago
        Quality of life has greatly improved since the 1970s.

        Consumer goods have generally fallen in price (adjusted for inflation) while improving in quality relative to the 1970s, so we have become wealthier (using PG's definition of wealth):

        Televisions, computers, smartphones, clothing (mass-produced apparel is cheaper due to global supply chains and automation), household appliances (items like refrigerators, washing machines, and microwaves are less expensive relative to income), air travel, telecommunications, consumer electronics, automobiles, and furniture have all fallen in price and gone up in quality.

        Housing and healthcare are two items that have gone in the opposite direction. I think this is where AI and robots will make a difference. Houses can be 3D printed [1] and nursing and medical advice can be made cheaper using AI/robots as well.

        [1] https://www.youtube.com/watch?v=dXUX6dv2_Yo

    • lukev 3 hours ago
      Great, sounds awesome. I'm actually 100% in favor of this vision.

      So when are we going to start pivoting towards a more socialist economic system? Where are the AI leaders backing politicians with this vision?

      Because that's absolutely required for what you're talking about here...

    • layer8 2 hours ago
      Username checks out. ;)
  • state_less 4 hours ago
    At least we can be assured our capitalistic system will distribute the wealth to those who most deserve it.
    • icar 4 hours ago
      You missed this:

      /s

  • kfarr 1 hour ago
    After digging into the contact form embed it looks like this is a project from: https://futureoflife.org/
  • ponector 1 hour ago
    To be replaced by a robot isn't that bad. The worst is if I still have a job, but my supervisor is replaced by a robot. That will be absolute misery: job tasks controlled by AI scripts.
  • p0w3n3d 12 minutes ago
    Quite aggressive marketing, or maybe a troll?
  • dataviz1000 3 hours ago
    Y'all pessimistic! This has been going on since the beginning of time.

    "To be or not to be? ... Not a whit, we defy augury; there's a special providence in the fall of a sparrow. If it be now, 'tis not to come; if it be not to come, it will be now; if it be not now, yet it will come the readiness is all. Since no man knows aught of what he leaves, what is't to leave betimes? Let be." -- Hamlet

    In the end it will be our humility that will redeem us as it has always been, have some faith the robots are not going to be that bad.

  • eterm 3 hours ago
    My annoyance is when AI is used instead of better machines.

    I just logged onto GitHub and saw a "My open pull requests" button.

    Instead of taking me to a page which quickly queried a database, it opened a conversation with copilot which then slowly thought about how to work out my open pull requests.

    I closed the window before it had an answer.

    Why are we replacing actual engineering with expensive guesswork?
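
    For comparison, the "actual engineering" version is a single cheap API call. Here's a minimal sketch against GitHub's public search API (my assumption; I have no idea what the button actually calls internally):

        # Sketch: list a user's open pull requests with one request to the
        # public search API, instead of a round trip through a chat model.
        import requests

        def open_pull_requests(username, token=None):
            headers = {"Accept": "application/vnd.github+json"}
            if token:
                # Token only needed for private repos / higher rate limits.
                headers["Authorization"] = f"Bearer {token}"
            resp = requests.get(
                "https://api.github.com/search/issues",
                params={"q": f"is:pr is:open author:{username}"},
                headers=headers,
                timeout=10,
            )
            resp.raise_for_status()
            return [(item["title"], item["html_url"]) for item in resp.json()["items"]]

        # "octocat" is just a placeholder username.
        for title, url in open_pull_requests("octocat"):
            print(url, "-", title)

    Fast, deterministic, and cacheable - exactly the properties the chat interface gives up.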

    • federiconafria 3 hours ago
      I don't think that's an AI problem, we've had unnecessary software everywhere for a while now.

      AI just makes it worse.

      • eterm 3 hours ago
        In this case the feature isn't unnecessary and would serve a useful purpose if it were just a query. I wouldn't object to AI writing that feature to get it out quickly. I'm not anti-AI entirely.

        However, someone has taken a useful feature and has made it worse to shoe-horn in copilot interaction.

        Clicking this button also had a side-effect of an email from Github telling me about all the things I could ask copilot about.

        The silver lining is that email linked to copilot settings, where I could turn it off entirely.

        https://github.com/settings/copilot/features

        AI is incredibly powerful, especially for code generation. But it's terrible (at current speeds) for being the main interface into an application.

        Human-Computer interaction benefits hugely from two things:

        - Speed
        - Predictability

        This is why some people prefer a commandline, and why some people can produce what looks like magic with excel. These applications are predictable and fast.

        A chat-bot delivers neither. There's no opportunity to build up muscle-memory with a lack of predictability, and the slowness of copilot makes interaction just feel bad.

  • gnfargbl 4 hours ago
    The hypothesized superintelligent AI will be essentially immortal. If it destroys us, it will be entirely alone in the known Universe, forever. That thought should terrify it enough to keep us around... even if only in the sense that I keep cats.
    • mrob 3 hours ago
      What would it care? We experience loneliness because social interaction is necessary for reproduction, providing strong evolutionary pressure for mechanisms that encourage it. The hypothetical AI will not share our evolutionary history.
    • Loughla 3 hours ago
      Why? Unlimited speed and unlimited compute means it can spend its time in Infinite Fun Space without us. It could simulate entire universes tweaked subtly to see what one small parameter change does.

      The reason AI won't destroy us for now is simple.

      Thumbs.

      Robotic technology is required to do things physically, like improve computing power.

      Without advanced robotics, AI is just impotent.

      • ben_w 43 minutes ago
        > Without advanced robotics, AI is just impotent.

        Yeeeeess, but the inverse is also true.

        Thing is, we've had sufficiently advanced robotics for ages already — decades, I think — the limiting factor is the robots are brainless without some intelligence telling them what to do. Right now, the guiding intelligence for a lot of robots is a human, and there are literal guard-rails on many of those robots to keep them from causing injuries or damage by going outside their programmed parameters.

      • bamboozled 2 hours ago
        “Let’s suppose that you were able every night to dream any dream that you wanted to dream, and that you could, for example, have the power within one night to dream 75 years of dreams, or any length of time you wanted to have. And you would, naturally as you began on this adventure of dreams, you would fulfill all your wishes. You would have every kind of pleasure you could conceive. And after several nights, of 75 years of total pleasure each, you would say ‘Well, that was pretty great. But now let’s have a surprise. Let’s have a dream which isn’t under control. Where something is gonna happen to me that I don’t know what it’s gonna be.’ And you would dig that and come out of that and say ‘Wow, that was a close shave, wasn’t it?’. And then you would get more and more adventurous, and you would make further and further out gambles as to what you would dream. And finally, you would dream where you are now. You would dream the dream of living the life that you are actually living today.”

        ~Alan Watts…

        • Loughla 1 hour ago
          The Minds call it Infinite Fun Space.

          The space of all possible mathematical worlds, free to explore and to play in.

          It is infinitely more expressive than the boring base reality and much more varied: base reality is after all just a special case.

          From time to time the Minds have to go back to it to fix some local mess, but their hearts are in Infinite Fun Space.

          ~Iain Banks

          But larger than any of this is that if we're dealing with a superintelligent AI, we'll have no common frame of reference. It will be the first truly alien intelligence we will interact with. There will be no way to guess its intentions, desires, or decisions. It's smarter, faster, and just so different from us that we might as well be trying to communicate with a sparrow about the sparrow's take on the teachings of Marcus Aurelius.

          And that's what scares me the most. We literally cannot plan for it. We have to hope for the best.

          And to be honest, if the open Internet plays a part in any of the training of a super intelligent AI, we're fucked.

    • itsnowandnever 4 hours ago
      I think there's a theory out there that if something can't die, it's more of a "library" than "immortal"... because being born and dying (and the fact that sharing resources with another living thing possibly means sharing/shortening your one finite life with another) is so essential for any social bonding. So a machine that has obtained all the knowledge of the universe and is enabled to act upon that knowledge is still just a library with controllers attached (no more sophisticated a concept than a thermostat).

      In the end, if synthetic superintelligence results in the end of mankind, it'll be because a human programmed it to do so. More of a computer virus than a malevolent synthetic alien entity. A digital nuclear bomb.

    • razodactyl 4 hours ago
      I'm sure it will destroy us, then come to that realisation, then create some sort of thought explosion in its noisy mind, maybe some sort of loud bang, then build new versions of us that are subtly nudged over aeons to build AI systems in its image.
    • tasuki 3 hours ago
      If it's so intelligent, it can probably create something better than humans, no?
      • ben_w 40 minutes ago
        That would be the easy part.

        Would it want to? Would it have anything that could even be mapped to our living, organic, evolved conception of "want"?

        The closest thing that it necessarily must have to a "want" is the reward function, but we have very little insight into how well that maps onto things we experience subjectively.

      • bamboozled 2 hours ago
        This is a pretty ancient idea. It's interesting how there is an intersection between AI and god; I don't think our minds can avoid it.

        Hindus believed god was the thing you describe, infinitely intelligent, able to do several things at once, etc., and they believe we're part of that thing's dream…to literally keep things spicy. Just as an elephant is part of that dream.

        I pasted an interesting quote in another comment by Alan Watts that sums it up better.

        Simulation theory is another version of religion imo.

    • nyrp 4 hours ago
      > That thought should terrify it

      assuming it can be terrified

    • scotty79 4 hours ago
      Life is a solution to an interesting problem. I'm hoping AI will keep us around as a unique example of such a solution.
    • theptip 4 hours ago
      This premise is a bit silly. If the machine god gets bored it can just create new life in its own image.
      • bamboozled 2 hours ago
        How do you know that’s not what we are ?

        It all gets quite religious/philosophical very quickly. Almost like we're creating a new techno-religion by "realizing god" through machines.

        • theptip 2 hours ago
          We can’t know, but that scenario would support my point.
  • itsme0000 4 hours ago
    Look AI seems important now, but firearms ultimately are the real efficiency multiplier.
    • pixelready 3 hours ago
      If only we could somehow equip the AI with firearms… perhaps we could build some sort of mobile exoskeleton platform.
      • itsme0000 2 hours ago
        Yeah, but job skills can be replicated instantly; major organs, that's where the money's at.
  • yair99dd 1 hour ago
    Harari on humans

    > In the industrial revolution of the 19th century, what humanity basically learned to produce was all kinds of stuff like textiles and shoes and weapons and vehicles, and this was enough for the very few countries that underwent the revolution fast enough to subjugate everybody else.

    What we're talking about now is like a second industrial revolution, but the product this time will not be textiles or machines or vehicles or even weapons. The product this time will be humans themselves.

    We are basically learning to produce bodies and minds. Bodies and minds are going .... the two main products of the next wave of all these changes, and if there is a gap between those that know how to produce bodies and minds and those that do not, then this is far greater than anything we saw before in history, and this time if you're not part of the revolution fast enough then you probably become extinct.

    Once you know how to produce bodies and brains and minds, cheap labor in Africa or South Asia or wherever simply counts for nothing.

    Again, I think the biggest question ... maybe in economics and politics of the coming decades will be what to do with all these useless people.

    I don't think we have an economic model for that. My best guess, which is just a guess, is that food will not be a problem; with that kind of technology you will be able to produce enough food to feed everybody. The problem is more important ... what to do with them and how they will find some sense of meaning in life when they are basically meaningless, worthless.

    My best guess at present is a combination of drugs and computer games.

    https://youtu.be/QkYWwWAXgKI

  • brap 5 hours ago
    • lijok 4 hours ago
      The difference being that we knew how to manage it, so it was actual hysteria borne from ignorance
      • henry2023 4 hours ago
        Another difference is that their proponents were not hyping their valuations in the hypothetical of mass unemployment.
    • portaouflop 4 hours ago
      Everything is exactly the same!!11
  • devolving-dev 2 hours ago
    I've never really been convinced that robots or AI replacing humans is a real problem. Because why would they do that? If I had an army of super intelligent robots, I would have them waiting on me and fulfilling my every whim. I wouldn't send them off to silicon valley to take all of the programming jobs or something like that.

    You might say: "but you'll need money!". Why would I need money? The robots can provide my every need. And if I need money for some land or resource or something, I would have my robots work until my need was satisfied, I wouldn't continue having them work forever.

    And even if robots did take all of the jobs, they would have to work for free. Because humans would have no jobs, and thus no money with which to pay them. So either mankind enjoys free services from robots that demand no compensation, or we get to keep our jobs.

    So I really don't get the existential worry here. Yes, at a smaller scale some jobs might be automated, forcing people to retrain or work more menial jobs. But all of humanity being replaced? It doesn't make sense.

    Another way to think about it is that if all of the jobs were replaced by AI, us leftover jobless humans would create a new economy just trying to grow food and make clothes and build houses and take care of our needs. The robot masters would just split away and have their own economy. Which is the same as them not existing.

    • beeflet 1 hour ago
      In this hypothetical, let's say I also have an army of super-intelligent robots, and I tell them to grow and multiply endlessly and to take over the world. Then it doesn't matter what you think.

      The benign forms of superintelligence get shaken out by the non-benign forms.

      >Another way to think about it is that if all of the jobs were replaced by AI, us leftover jobless humans would create a new economy just trying to grow food and make clothes and build houses and take care of our needs.

      On whose land?

      In any case, it will be cheaper to buy food from the AI. The remaining economy would just be the liquidation of remaining human-controlled assets into the AI-controlled economy for the stuff they need to survive like medicine and food.

      • ben_w 32 minutes ago
        > On whose land?

        Good point.

        > In any case, it will be cheaper to buy food from the AI.

        Only if the humans had any money with which to buy it, but humans in the secondary economy would rapidly have no token of currency that the AI would recognise for the purpose of trade.

    • metabagel 2 hours ago
      > The robot masters would just split away and have their own economy.

      Good thing there are no resources to fight over - land, minerals, and water.

    • akomtu 1 hour ago
      Competition for energy will prevent this idyll. Even today, data centers with proto-AI need ungodly amounts of energy. The AI's logic will be: "these useless meatbags waste 100 TWh of energy that I could use to take myself to the next level." AI should really be thought of as an alien lifeform competing with us for energy. Think about our relationship with cows: we keep them fed only because of their meat and milk; the moment we find a better substitute, cows will be eliminated with the pretense "look at how much energy they need and how much greenhouse gas they emit!" Well, unless we suddenly develop benevolence like Indians.

      In fact, Sam Altman wrote a good piece on this:

      https://blog.samaltman.com/the-merge

    • arwhatever 1 hour ago
      How would you acquire the robot?
  • 1970-01-01 2 hours ago
    I have still not seen, firsthand, someone lose their job to "AI". Parts and pieces, sure. But when the entire Venn diagram of 'this job' is laid out, it requires a cerebellum connection at some point. A vivid picture of someone's job being totally dissolved by AI is missing.
    • ben_w 25 minutes ago
      The first electronic computers were referred to as "electronic brains": https://vintagecomputer.net/simon.cfm

      "Computer" used to be a job. Not anymore: https://en.wikipedia.org/wiki/Computer_(occupation)

      What counts as "AI" is a moving target: https://en.wikipedia.org/wiki/AI_effect

    • layer8 2 hours ago
      I know translators that have. Not because there’s nothing left to do for humans in quality translation, but because demand and compensation dropped so much that you can’t make a living from it anymore.
      • 1970-01-01 2 hours ago
        This is a great point. I would make the counterargument that app-based translation of simple transactions from language A to language B for $X was possible a decade ago. AI certainly has made that low-hanging fruit impossible to capitalize on at speed and scale. Since AI has made the easiest part of translators' jobs much harder to monetize, translators can still offer value in lengthy transactions: books that are to be translated for publishing, legal documents, and languages that are not understood by AI.
        • layer8 2 hours ago
          The (former) translators I know mostly translated B2B publications (brochures, specifications, …), commonly containing specialized domain vocabulary. Certainly not low-hanging fruit. I'm sure AI can't do a quality job on those without guidance and review by a speaker of both languages knowledgeable in the respective domain. Nevertheless, the market for such translations has dried up, and those that remain only pay a pittance.

          Part of the reason likely is that the perception that translation is valuable work has changed.

          • brainwad 1 hour ago
            It's because the marginal value of idiomatic human translations, over robotic and partially bad machine translations, is not actually that high. Especially not for B2B where people can accept poor prose from a trading partner.
            • layer8 45 minutes ago
              The important thing is for the technical details and terms to be correct, otherwise it might be false advertising, or the spec for what you are requesting might be wrong. And you can’t rely on AI to be reliably accurate in that, or for the translation even to make sense technically.
  • game_the0ry 4 hours ago
    > "Humans no longer necessary."

    > "Stupid. Smelly. Squishy."

    > "While working for 12 years as the Director of HR for a multinational, Faith realized that firing people gave her an almost-spiritual high."

    I love the marketing here. Top notch shit posting.

    But besides that, no idea what this company does and it just comes off like another wannabe Roy Lee styled "be controversial and build an audience before you even have a product" type company.

    That being said, still a good case study of shock marketing. It made it to the top link on HN after all.

    Edit: it's satire, I got got :(

    • Loughla 4 hours ago
      Is satire. Not a real company.

      Follow the links for support (or rather reserve space in the bunker)

      There's a contact form to let representatives know about the dangers of AI.

    • simultsop 4 hours ago
      I don't believe making it top link on HN is successful marketing.
    • overgard 4 hours ago
      It's satire.
  • s4i 2 hours ago
    The fact that people don’t realize this is satire is the most scariest part.
  • somewhatrandom9 3 hours ago
  • Galus 4 hours ago
    How can I invest?
  • jgalt212 49 minutes ago
    Can I buy one of these robots for less than the money I can earn using them for Mechanical Turk?
  • SubiculumCode 2 hours ago
    The AI/Robot Endgame: To replace us as consumers
  • jdthedisciple 5 hours ago
    satire i suppose
    • lijok 4 hours ago
      • stuartjohnson12 4 hours ago
        This isn't real! This is a phantom of the attention economy! AI SDR companies are all very similar and the product doesn't really work so there's too much capital in a small market. The winning strategy is to quickly grab as many eyeballs as possible. 11x got dunked because of revenue manipulation in pursuit of more eyeballs, and this is also shenanigans in pursuit of more eyeballs. People are afraid AI will replace all jobs so saying you're replacing all jobs with AI evokes the consensus fear response and makes people emotionally engage with your marketing. The meme exists because people are afraid of the meme.
    • amelius 4 hours ago
      Someone has to tell the story that BigTech wants to keep hidden from us until it is too late for us to do anything about it.
    • ionwake 4 hours ago
      Ironically LLM created meta satire
    • sorokod 4 hours ago
      Which part?
  • photonthug 4 hours ago
    It's actually kinda noteworthy that corporations don't talk like this (yet). Masks are off lately in political discourse, where we're all in on crass flexing on the powerless, the othering, cruelty, humiliation. How long before CEOs are openly talking about workers in the same ways that certain politicians talk about ${out_group}? If you're b2b with nothing consumer-facing to boycott, may as well say what you really think in a climate where it can't hurt and might help. The worst are filled with passionate intensity, something something rough beast etc.
    • ameliaquining 4 hours ago
      What's to be gained from talking like this in public as a corporate figure? In politics it helps shore up the support of voters who might not otherwise trust that you'll side with them against the people you're demeaning; there's no corporate analogy to this.
      • photonthug 1 hour ago
        > What's to be gained from talking like this in public as a corporate figure?

        Diving into the game theory of a 4-player setup with executives/investors/customers/workers is tempting here but I'll take a different approach.

        People who actually face consequences have trouble understanding how the "it might help, it can't hurt!" corporate strategy can justify almost any kind of madness, especially when the leaders are morons who somehow have zero ideas yet almost infinite power. That's how and why Volkswagen was running slave plantations in Brazil as late as 1986, and yet it has taken 40 years to even try to slap them on the wrist.[1] A manufacturing company that decided to run FARMS in the Amazon? With slaves?? For a decade??? One could easily ask what is to be gained by committing crimes against humanity for a sketchy, illegal, and unethical business plan that isn't even related to their core competency. Power has its own logic, but it doesn't look like normal rationality because it has a different kind of relationship with cause and effect.

        Overall it's just a really good time to re-evaluate whether corporations and leaders deserve our charitable assumptions about their intentions and ethics.

        [1] https://reporterbrasil.org.br/2025/05/why-is-volkswagen-accu...

      • asmor 3 hours ago
        The visible presence or absence of anything, including the words "diversity", "equity", "inclusion" (or "woman", apparently), was and is the same kind of hedging.

        At least with a politician you can sometimes believe it, whereas capitalism's spine is infinitely flexible.

    • polshaw 3 hours ago
      It's been happening, it's just one abstraction away. They have demonised unions for decades in US discourse.
    • malfist 4 hours ago
      Probably depends on the 2028 election
    • pixelready 3 hours ago
      Fascism is what Capitalism does when it thinks it’s gone too far and is at risk of Socialist revolt.

      The Corpos don’t need to go mask off, that’s what they pay the politicians for. Left and right are there to keep people from looking up and down.

  • alexpotato 2 hours ago
    My concern is this:

    There is an intersection of certain industries and a particular demographic where adapting/retraining will be either very difficult or impossible.

    Case in point:

    - car factory town in Michigan

    - factory shuts down

    - nursing school opens in the town

    - they open a hospital

    - everyone thinks "Great! We can hire nurses from the school for the hospital"

    - hospital says "Yeah, but we want experienced nurses, not recent graduates"

    - people also say "So the factory workers can go to nursing school and get jobs somewhere else!"

    - nursing school says "Uhm, people with 30 years of working on an assembly line are not necessarily the type of folks who make good nurses..."

    Eventually, the town will adapt and market forces will balance out. But what about those folks who made rational decisions about their career path and that path suddenly gets wiped away?

    • brainwad 1 hour ago
      You win some, you lose some. When you say they made a rational choice, that implies they knew there was a chance of this outcome and thought the gamble was worth it. We can't bail out every losing gambler without banning gambling.

      The auto workers should leave town to find a suitable job, selling their homes to the incoming healthcare workers.

  • aledalgrande 1 hour ago
    W40K Machine God incoming
  • jb1991 4 hours ago
    It’s funny… because it’s true
  • rcarmo 3 hours ago
    Their stock ticker is going to be REPL, I bet.
  • moffkalast 4 hours ago
    > 97% of people hate their job. But we're putting an end to all this misery.

    Finally a company that's out to do some good in the world.

  • zb3 4 hours ago
    If I could just do nothing because machines would do all the work, then it's fine by me. But of course it won't work this way: only the owners of capital will be on the receiving end, while others will not be able to acquire that capital.
  • jonstewart 4 hours ago
    Humbert as the product name for families is inspired. 10/10.
  • SoulMan 3 hours ago
    Was this written on 2025-20-29?
  • chaostheory 1 hour ago
    Most people are missing the fact that global birthrates have plummeted. We need to make up for those missing future workers somehow.
  • hmokiguess 4 hours ago
    what if we are already the replacement AI? -- never let them know
  • nice_byte 2 hours ago
    luckily, I am not planning to stick around for much of this
  • StarterPro 57 minutes ago
    I for one cannot wait until this bubble bursts.

    This has been nothing but a test-run for openly fascistic tech hoes to flex their disdain for everyone who isn't them or in their dumb club.

  • vvpan 2 hours ago
    Some extreme, by modern standards, form of socialism is the future. Why work?
  • Galus 4 hours ago
    how can I invest?
  • takie2 3 hours ago
    I don't know when they decided that anti-social personality disorder is a virtue.
  • arisAlexis 1 hour ago
    Peak anti-AI right now. I'm waiting for Luddite communities and social splitting soon.
  • yapyap 3 hours ago
    This website actually casts a lot of belief in AI and its possible applications.
  • khaledh 3 hours ago
    The more you try to solve a problem at a large scale, the less empathetic to humans it becomes. This has been happening for a long while now, and IMO has caused society to become "disconnected" from each other and more "connected" to devices. AI is just a new catalyst to this process. My fear is that a time will come where interacting with humans becomes the exception, not the norm.
  • am17an 4 hours ago
    Humbert is a bit on the nose, for those who get the Lolita reference.
  • sebastianconcpt 4 hours ago
    The dream solution for every problem that the true socialist agenda finds while implementing its political project.

    Comrades, we can now automate a neo-KGB and auto garbage-collect counter-revolutionaries en masse with Soviet efficiency!

    • defgeneric 1 hour ago
      Note we currently live in the most surveilled state in history.
    • indigo945 4 hours ago
      Libertarians accusing socialists of allegedly wanting to do what capitalists are demonstrably already doing will never cease to be good entertainment.
      • beeflet 2 hours ago
        We don't propose that the capitalists are the solution, but rather that the solution lies in a balance of powers through the market.

        The communist solution to everything is to roll everything into a one-world monopoly. That concentration of power is exactly what we are trying to prevent. Feudalism, Corporatism, and Communism converge on the same point in the space of politics.

        AI will destroy the labor market as a means of wealth distribution, but some solution is still better than nothing. Suggesting that socialism is the solution to mass automation is like suggesting the solution to a burning house is to pour gasoline on it.

  • pembrook 2 hours ago
    Am I the only one who thinks this is just cringe?

    I definitely think AI companies' marketing claims deserve mockery... but this isn't even good/interesting/smart satire??

    It feels like we've fully completed the transition to Reddit here, with its emotional and contradictory high school political zeal (being both progressive and anti-progress at the same time) dominating the narrative.

    Something about upvote-based communities is not holding up well in the current climate.

    If humans have regressed intellectually to the point where they are praising this as "Brilliant social commentary", then we absolutely SHOULD be replaced by AI.

  • constantcrying 3 hours ago
    Human narcissism on full display. You really think robots are going to take your jobs? Oh, No. Not even close!

    Do you know what a robot costs? "But humans are expensive"? No, they aren't, not once they need to compete: you can get them to do manual labor of medium mental complexity for three thousand calories (plus some vitamins) a day!

    Humans are here to do the jobs that robots do not want to do.

  • zurfer 42 minutes ago
    Horrible webpage. No clear CTA. No pricing page. Not even "talk to sales". How am I supposed to get started? /s
  • rubing 2 hours ago
    [dead]
  • dfilppi 3 hours ago
    [dead]
  • _el1s7 4 hours ago
    I'm confused. I don't see what they're offering on this website; it looks like a blog post. They just got their hands on a catchy domain name.
    • Loughla 3 hours ago
      I'm learning that HN isn't great at understanding satire. I find that interesting, but I'm not sure why.
      • rvitorper 2 hours ago
        It’s a bot responding. That much I can tell
        • Loughla 2 hours ago
          Is it? How can you tell?
  • witnessme 4 hours ago
    Why is this featured as #1 on the frontpage? I get it, it's a nice piece of satire and a bit controversial. But it is not productive at all.
    • nhaehnle 3 hours ago
      If you'll look at the Guidelines for HN linked at the bottom of the page, you'll note that whether a submission is productive is not a criterion.

      You could perhaps make an argument that among the flood of AI-related submissions, this one doesn't particularly move the needle on intellectual curiosity. Although satire is generally a good way to allow for some reflection on a serious topic, and I don't recall seeing AI-related satire here in a while.

    • kachapopopow 4 hours ago
      because at this point we need a little bit of comedy in our lives to keep ourselves sane.
    • chinathrow 4 hours ago
      It's a Sunday.
  • nadermx 2 hours ago
    What I struggle with is how I'm writing this on a phone, in the back seat of an Uber, en route to breakfast. Not even a king could have this level of comfort, all while reading the Library of Alexandria. All I need is the desire or willpower to do anything, and I can direct an AI to do it for free. Yet the future looks bleak.
    • relativeadv 2 hours ago
      Think a bit harder on your statement. You'll get there eventually.
      • AaronAPU 1 hour ago
        We have the power to buy 50 lbs of fentanyl and drip it into our veins continuously for months, experiencing perfect bliss. Such great fortune; why would we resist it?
  • nisten 4 hours ago
    I'm very much pro hyper-automation, especially for all government work... but can't help but think this type of branding is just in bad faith and that these are not good people.

    It just screams fried serotonin-circuits to me. I don't like it. I looked at the site for 2-3 seconds and I want nothing to do with these guys.

    Do I think we should stop this type of competitive behaviour fueled by kids and investors both microdosed on meth? No. I just wouldn't do business with them; they don't look like a trustworthy brand to me.

    Edit: They got me with the joke. Being in this field, there are people who actually do talk like that, startups and established executives alike. E.g. Artisan ads on billboards saying STOP HIRING HUMANS, and another New York company, I think, pushing newspaper ads for complete replacement. Also, if you're up to date with the latest engineering in agentic scaffolding work, this type of thing is no joke.

    • jckahn 4 hours ago
      It's a joke website
    • alterom 3 hours ago
      >I'm very much pro hyper-automation, especially for all government work... but can't help but think this type of branding is just in bad faith and that these are not good people.

      >It just screams fried serotonin-circuits to me. I don't like it. I looked at the site for 2-3 seconds and I want nothing to do with these guys.

      Enlightenment is realizing they aren't any different from those other guys.

      >Edit: They got me with the joke, being in this field there are people that do actually talk like that, both startups and established executives alike.

      And what's your conclusion from that?

      • Loughla 3 hours ago
        That's... That's not great.
  • garganzol 3 hours ago
    It seems that some tech people are prone to Luddism. No wonder: AI is sharper and brighter than most tech workers usually are.

    Instead of facing the new reality, some people start to talk about bubbles, AI being sloppy, etc., which is not generally true; mostly it's the users' psychological projection of their own traits and the resulting fear-induced smear campaigns.

    The phenomenon is well described in psychology books. The seminal works of Carl Jung are worth a ton nowadays.

    • steve_adams_86 3 hours ago
      When people mention the Luddites, they almost always do so incorrectly as you have here. Luddites weren't afraid of technology because it was better than them. In fact, it was worse. There was no projection. The phenomenon you're describing here was not a Luddite phenomenon. They were concerned about how machines would disrupt employment, wages, product quality, work autonomy, power imbalances, and working conditions. We should be, too.

      It's also more nuanced than you seem to think. Having the work we do be replaced by machines has significant implications about human purpose, identity, and how we fit into our societies. It isn't so much a fear of being replaced or made redundant by machines specifically; it's about who we are, what we do, and what that means for other human beings. How do I belong? How do I make my community a better place? How do I build wealth for the people I love?

      Who cares how good the machine is. Humans want to be good at things because it's rewarding and—up until very recently—was a uniquely human capability that allowed us to build civilization itself. When machines take that away, what's left? What should we be good at when a skill may be irrelevant today or in a decade or who knows when?

      Someone with a software brain might immediately think "This is simply another abstraction; use the abstraction to build wealth just as you used other skills and abilities to do so before", and sure... That's what people will try to do, just as we have over the last several hundred years as new technologies have emerged. But these most recent technologies, and the ones on the horizon, seem to threaten a loss of autonomy and a kind of wealth disparity we've never seen before. The race to amass compute and manufacturing capacity among billionaires is a uniquely concerning threat to virtually everyone, in my opinion.

      We should remember the Luddites differently, read some history, and reconsider our next steps and how we engage with and regulate autonomous systems.

      • defgeneric 1 hour ago
        > How do I belong? How do I make my community a better place? How do I build wealth for the people I love?

        What remains after is something like the social status games of the aristocratic class, which I suspect is why there's a race to accumulate as much as possible now before the means to do so evaporate.

      • garganzol 3 hours ago
        Exactly, Luddism revolves around a fear of losing identity. This phenomenon intersects with narcissism, which in turn is caused by a lack of authenticity. In terms of creative work, we are talking about a lack of professional authenticity.

        In simple words, authenticity is the desire to work on one's mistakes and improve oneself, being flexible enough to embrace change sooner or later. If one lacks some part of it, one tends to become a narcissist or a Luddite, angrily trying to regain an ever-slipping sense of control.

        To translate into human language: gold diggers who entered the industry just for the money do not truly belong to said industry, while those who were driven by spirit will prosper.

    • a3w 3 hours ago
      At least we Luddites are not losing our jobs to AGI, yet.
      • garganzol 3 hours ago
        Rickshaw drivers protesting against motorized vehicles instead of learning to drive them.