Retool, a development platform for business software, recently published the results of its State of AI survey. Over 1,500 people took part, all from the tech industry:
Over half of all tech industry workers view AI as overrated.
Quite the opposite. People who understand how LLMs work know their limitations. AI in general is incapable of deduction and creativity; it simply is not able to produce something genuinely new from existing knowledge. Sure, it can generate any number of outputs by transforming its input data, but it cannot create.
If you think developers, engineers, architects, and others are going to lose their jobs, you are severely mistaken. Even for content writers it's a temporary setback: AI-generated content is limited, and as soon as the quality of the human input feeding those same models starts dropping, so will the AI's output.
AI (LLMs) is just a tool, and people will have to learn how to use it if they want to keep their jobs. Those who refuse to learn it will be left behind, just like people who refused to use CAD software, or people who think they can saw, drill, and so on by hand as fast as a machine.
Current AIs (LLMs) won't replace anyone; people who know how to use them will.
It's a tool that you have to babysit, at least for the foreseeable future. In general it's a bad idea for a human to supervise a machine, because over time we grow complacent about its results, and that's when mistakes happen. When it comes to writing content, the biggest problem is inaccuracies or the occasional typo. Your comparison to CAD software is not a good one, since CAD doesn't produce anything on its own; it's software assisting a human, not generating content. Imagine the horror of CAD software auto-generating bridges: it would only be a matter of time before someone skipped double-checking what was generated. And I am fully aware there are AI-generated structural parts and tests, but those are part of a design process where the results have to be checked by a human again.
I do think AI has a place and purpose, but it's not going to cost people their jobs, only help them do them more efficiently. It's great at assisting people, not at replacing them. If there's a manager out there who thinks AI can replace a human, then I can show you a bad manager who doesn't understand what AI is. In the future we might reach a point where AI is good enough to do some jobs humans find too repetitive or dangerous, but we are far from that.
Also, LLMs are not something I'd call AI, or at least not intelligent. They are specialized neural networks, trained on human input, whose sole purpose is predicting the next word or sentence in relation to what's entered as input: a glorified, overly complicated auto-complete. There's no intelligence involved.
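To make the "auto-complete" analogy concrete, here is a minimal, purely illustrative Python sketch of generation as repeated next-word prediction. The word table and probabilities are invented for the example; in a real LLM the distribution over next tokens comes from a trained neural network conditioned on the whole preceding context, not a hand-made lookup keyed on the last word.

```python
# Toy illustration (not a real LLM): text generation as repeated
# next-word prediction over a hand-made probability table.
# The vocabulary and probabilities are invented for this example.

NEXT_WORD_PROBS = {
    "the":    {"cat": 0.5, "dog": 0.3, "bridge": 0.2},
    "cat":    {"sat": 0.6, "ran": 0.4},
    "dog":    {"barked": 0.7, "sat": 0.3},
    "sat":    {"down": 0.9, "quietly": 0.1},
    "bridge": {"collapsed": 0.5, "held": 0.5},
}

def autocomplete(prompt: str, max_words: int = 5) -> str:
    """Greedily append the most likely next word, one step at a time."""
    words = prompt.split()
    for _ in range(max_words):
        candidates = NEXT_WORD_PROBS.get(words[-1])
        if not candidates:
            break  # no known continuation for the last word
        # A real LLM computes this distribution with a neural network
        # and usually samples from it; here we simply take the maximum.
        words.append(max(candidates, key=candidates.get))
    return " ".join(words)

print(autocomplete("the"))  # -> "the cat sat down"
```

The point of the sketch is the loop structure: at each step the model only picks a plausible continuation of what came before, which is why output can read fluently without any deduction happening.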
"I do think AI has a place and purpose, but it's not going to cost people their jobs, only help them do them more efficiently. It's great at assisting people, not at replacing them."
That's my whole point: people who utilize AI (I agree that calling LLMs AI is dumb, but that's how the term is currently used, and more people understand "AI" than "LLM") will do the job more efficiently, and therefore you'll need fewer people to do the same job (which is why I used CAD as an example, since doing the drawings by hand takes way longer). That results in people losing their jobs because they are no longer needed.
That's not exactly how I view the outcome of introducing new tools, but that will have to be the part where we agree to disagree. In my opinion, tools remove tedious tasks entirely or make them easier, giving you more time to focus on what matters.