Tesla to market a robot with a human-like appearance which would carry out the work people like to do least

Hmmm. What would be the work people would like to do least?

(Reuters) – Tesla Inc (TSLA.O) CEO Elon Musk on Thursday said it will probably launch the prototype of a humanoid robot called “Tesla Bot” next year, saying the robot would “eliminate dangerous, repetitive, boring tasks.”

The robot with a human-like appearance would carry out the work people like to do least, with “profound implications for the economy,” Musk said at the company’s AI Day event on Thursday.

more

35 Comments on Tesla to market a robot with a human-like appearance which would carry out the work people like to do least

  1. Trying to wrap my head around how making 75% of the world unemployable improves the global economy?

    The reality of the human economy is that it is about humans trading labor and skills and tasks with each other. The more robots you have to do those things, the more humans become completely irrelevant. The world just becomes a giant hobo camp at that point.

    8
  2. But seriously, as someone who works with robots every day and, more relevantly, works with people who work with robots every day, let me tell you that YOU DO NOT WANT TO HUMANIZE THEM!!!

    It has nothing to do with them pretending to be “hooman” for nefarious purposes or because they’re “takin’ our jerbs”. See, the thing is, robots are MUCH more powerful than humans, MUCH faster than humans, and … this is the IMPORTANT PART … DO NOT CARE ABOUT ANYTHING THEY ARE NOT PROGRAMMED TO CARE ABOUT OR HAVE A SENSOR FOR.

    …The thing about robots is that NO fucks are given. NONE. ZERO.

    They
    DO.

    NOT.

    CARE.

    The “brains” they have are the basic movement instructions and limitations that the OEM put on them, and then they get tailored by people like me for EXTREMELY specific purposes using their VERY LIMITED array of sensors that can break down or be fooled by something as simple as a dried bead of hard water. Whatever they do is what they are TOLD to do, no more, and no less.

    If a robot is programmed to do a specific thing, and you are in the way of that specific thing, and it is NOT programmed or sensored to DETECT you being in the way of that specific thing or do anything ABOUT you being in the way of that specific thing, it WILL do that specific thing THROUGH you.

    Without hesitation, without mercy, without even so much as awareness, it WILL try to do what it is programmed to do, and it will do it without warning. It won’t even think in terms of knocking you out of the way or knocking you down or anything anthropomorphic like that, it simply won’t think at ALL; it will execute its program unless the weight of your corpse exceeds its overload parameters or, as far as its sensors can tell, interferes with its tooling completing its movements.

    Robots simply DO NOT CARE.
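    To make that concrete, here is a toy Python sketch of what a pick-and-place cycle boils down to. Every class, position, and input name in it is invented for illustration (real controllers run vendor motion languages, not Python), but the shape of the logic is the point: the only things that exist for the robot are the inputs somebody wired in and the branches somebody wrote.

    ```python
    class ToyArm:
        """Stand-in for a robot controller. It knows taught positions, a couple
        of digital inputs, and nothing else -- no concept of 'person' exists."""

        def __init__(self):
            self.inputs = {"PARTS_PRESENT": True, "TORQUE_FAULT": False}
            self.position = (0.0, 0.0, 0.0)

        def read(self, name):
            return self.inputs[name]

        def move_to(self, target):
            # Full commanded speed, every cycle. There is no "is someone in the
            # way?" test, because no sensor for that was wired in and no branch
            # for it was written.
            self.position = target

        def gripper(self, state):
            pass  # open/close; the arm neither knows nor cares what it is holding


    PICK = (120.0, 40.0, 15.0)    # "taught" points, in mm
    PLACE = (320.0, 40.0, 15.0)

    def run_cycle(arm, cycles=3):
        for _ in range(cycles):
            if not arm.read("PARTS_PRESENT"):   # the only start condition it was given
                break
            arm.move_to(PICK)
            arm.gripper("CLOSE")
            arm.move_to(PLACE)
            arm.gripper("OPEN")
            if arm.read("TORQUE_FAULT"):        # trips only past the overload limit
                print("overload fault -- stopping")
                break

    arm = ToyArm()
    run_cycle(arm)
    print("ended at", arm.position)  # it went there; whether anything was in the way, it cannot know
    ```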

    …and you don’t get much of a visual cue from its size, either. I have a cartoner that uses three little robot arms, maybe about the size of a smaller adult woman’s arms (Fanuc LR Mate 200, for any robot buffs out there), so a human around them MIGHT think they would be overpowered if someone simply grabbed them, but that is NOT true.

    AT ALL.

    The servos in these would rip your arm apart or crush your hand and not even slow down significantly, or stop doing it over and over again unless the conditions it was programmed to load with ceased or one of your co-workers got over their shock long enough and wasn’t too stupid to figure out where the emergency stop is, and pressed it. It doesn’t care if you bleed or scream, it doesn’t even KNOW you did.

    Because it doesn’t know ANYTHING.

    So why not ‘humanize’ this? “friendly it up” as they put it in “I, Robot”?

    Because you don’t WANT people thinking of these in HUMAN terms. You don’t WANT them put off their guard. You don’t WANT any sort of simulated human appearance subconsciously making people more at ease around them, relaxing them to think they are dealing with something that thinks, feels, and reacts like them.

    Because nothing could be further from the truth.

    People also don’t understand how weak they are relative to ANY machine, or how their chalky skeletons and easily damaged flesh are affected by the forces of acceleration and mass and their own inertia. Shortly after I began doing this work, we had a younger woman (<30) who was a Quality inspector on a filling machine (they checked the containers for various things both before and after fill, don't worry about the details but know she was SUPPOSED to be near this machine), and the filling area was not designed with super guarding. This gal saw something on a pouch and reached out to wipe it, and it caught her sleeve just before it indexed forwards about 3 inches.

    It broke her arm in 2 places, dislocated her shoulder, and yanked her face into the fill nozzle in the space of under 1 second.

    Machines WILL hurt you BAD if you lose respect for them, and giving them a "face" WILL cause you to do EXACTLY THAT.

    …on a side note, one thing that's getting bandied about in modern robotics is so-called "Collaborative" robots, robots that are reduced in power and sensored in such a way that they are NOT required to be in cages, but rather CAN have humans working close to them, even next to them, with relatively little risk. Setting aside that they can't do a ton because of the speed and power reductions, the best they can do is slow down as someone approaches them, and STOP if someone touches them.

    This means that if you do not protect them ANYWAY, people WILL slow your production down just by being NEAR your robots.
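    Here is the same kind of made-up sketch for that “slow down, then stop” behavior. The distances and speed factors are invented; the point is how thin the logic is, and how completely it depends on the proximity input being honest.

    ```python
    def speed_override(distance_mm):
        """Scale commanded speed from the proximity reading -- and ONLY from that reading."""
        if distance_mm is None or distance_mm <= 0:
            return 0.0    # contact (or no usable reading): stop
        if distance_mm < 500:
            return 0.1    # crawl while someone is right next to it
        if distance_mm < 1500:
            return 0.5    # slow zone
        return 1.0        # full speed

    # A person walking up to the robot and then touching it:
    for reading in (2000, 1200, 400, 0):
        print(reading, "mm ->", speed_override(reading))

    # The catch: if the sensor is fogged, dirty, or simply lying and keeps
    # reporting 2000 mm, this happily returns full speed. The "collaboration"
    # is only as good as the input.
    print("stale sensor still says 2000 mm ->", speed_override(2000))
    ```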

    That also puts WAYYYY more faith in Chinese-made sensors than I am comfortable with, because again, no matter HOW clever your program is, the robot is literally a slave to its inputs, and if the input doesn't work right, neither does the ROBOT. It's way too big a topic for right here, but machine "vision" does NOT work like HUMAN vision does, ultrasonic sensors can be deflected instead of returned, photo sensors can be wigged out by moisture, etc., all of which means I sure as HELL am not going to trust MY life to them even after working with them for over a quarter century, so why would YOU want to trust them with YOURS?

    …and this doesn't even get into things like the actual tooling and how the robot could frisbee the part it's moving, or parts of itself, at you if it's worn, damaged, or not maintained well, but this is SO far into TL;DR territory that no one's even SEEING this, so I'll end right here…

    4
  3. geoff the aardvark
    AUGUST 20, 2021 AT 12:35 PM
    “I hate robots! Maybe it’s because I’ve read too many sci fi stories about robots run amuck. We have enough problems already without robots trying to run and ruin our lives as it is.”

    …the ORIGINAL “Dune” backstory (NOT the one ruined by Frank Herbert’s kid) is probably closer to the truth. I wouldn’t worry about robot sentience, but men USING robots to enslave other men is no further away from reality than Facebook is from being the SOLE source of news in the world. If you watch Boston Dynamics, or the use of both flying and ambulating drones even NOW in some places for COVID enforcement and for practical use in actual warfighting, it’s very doable indeed…

    1
  4. Robots are indeed soulless machines, but that won’t stop soulless, heartless, amoral Demwits and other leftists from giving robots rights equal to humans. Therein lies the corrupted chip.

    2
  5. Uncle Al ʘ
    AUGUST 20, 2021 AT 12:36 PM
    “I’m trying to figure out how to program a robot to hunt down and kill other robots while making it impossible to hunt down and kill humans.”

    …very doable provided you have a sensor that keys on something specific to robots, such as a magnetic sensor for iron or an electrical field sensor for stainless/plastic robots.

    AND no one can change YOUR program.

    …the original “Terminator 2” story said that the AhnoldBot was set to READ ONLY because Skynet didn’t want its drones “learning”. It only became a problem for Skynet when John Connor opened a little round head hatch and flipped a DIP switch to “LEARN” (why would Skynet even MAKE that? Typical villain…)

    …it would be like that…

    https://image.slidesharecdn.com/terminator2-judgmentday02-120929132322-phpapp02/95/terminator-2-judgment-day-02-16-728.jpg?cb=1348925114

  6. “‘Tesla Bot’ humanoid robot prototype next year”

    So we can expect some advanced Roomba by 2028, in other words.

    Hey, it’s just in keeping with Tesla’s track record of broken promises and milking the sheep for more money.

    3
  7. LOL, the most unwanted job would be White House Gynecologist. Think about it: Hilary, Big Mike, Kamala, Chelsea, Dr. Jill. It would be like pulling open a grilled cheese sandwich every time one of those hags came in.

    2
  8. …but it just doesn’t give them the same satisfaction while screaming at it that it’s not doing it the way THEY think it should be done or as quickly as THEY think it should be done.

    1

Comments are closed.