this post was submitted on 20 Jun 2023
23 points (100.0% liked)
Technology
This magazine is dedicated to discussions on the latest developments, trends, and innovations in the world of technology. Whether you are a tech enthusiast, a developer, or simply curious about the latest gadgets and software, this is the place for you. Here you can share your knowledge, ask questions, and engage in discussions on topics such as artificial intelligence, robotics, cloud computing, cybersecurity, and more. From the impact of technology on society to the ethical considerations of new technologies, this category covers a wide range of topics related to technology. Join the conversation and let's explore the ever-evolving world of technology together!
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
Probably "AI". It's not actual like AI in films , but it is a great program that can analyze human language and extrapolate pretty interesting data from a wide variety of sources, then reformulate that data into a human language that is concise and useful. I am personally excited about the prospect of a personal AI that can run locally or in a truly private cloud services that can learn my speech, my habits, my interests, etc and be ready to provide me with customized data from the Internet or from it's observation of me.
It's a double-edged sword, though. While a tool like this should be used to discover more truth about the world we live in, there will be many bad actors trying to use it to manipulate and lie in order to achieve their nefarious goals. AI will also need to be trained to detect when something is fake, and I feel that this will become a never-ending arms race.
Maybe this is just a contrarian view, but I see "AI" as a potential rather than a technology. Right now, transformer-based technologies are what most of us mean when we talk about AI, and it's not clear to me how much more potential that idea really has. When I look at how much energy it takes to set up something like GPT-4, I see us pushing hardware to its limits, and yet the outcomes are still too often unsatisfying. Significant breakthroughs are needed somewhere in that architecture just to do the kinds of things we're trying to do today at the fidelity we expect and without breaking the bank.
The technology we have today might be to AI what the phonograph was to audio recording. As a technology, the phonograph hit the limits of its potential pretty quickly and then… we fixated. Entirely different technologies eventually led to the lossless spatial audio experiences we can enjoy today, and they seem more likely to carry future potential for audio too.
In that analogy, GPT might just be like someone arranging 8 gramophones in a circle to mimic the kind of spatial audio experience available in some headphones now. Impressive in many ways, but directionally not the path where potential lies.
I agree. Many people are fixated on GPT because it is shiny and novel, but it is certainly not the pinnacle of what AI could be, or even close. One day, we will look back on calling GPT an “AI” like we would someone calling the first two tin cans on a string a “phone”. Accurate enough, but certainly a far cry from any modern phone.
AI has the capacity to be the most impactful overall to our daily lives, but like most things, advancement will continue to be limited by hardware.
True, I think AI has the biggest potential to change our lives in the near future. I don't think we are anywhere near generalised AI right now, but even the current LLMs have amazing capabilities. I think there may be many ways to apply these AIs that we haven't thought of yet.
Now here's to hoping that these AIs won't be monopolised by corporations but instead stay available to the general public.
I think there's going to be a lot of AIs, corporate and open, public and private. It's already happening. Things are going to get weird very fast.
This is why we should push against attempts to strangle open-source AI products and research.
Sadly, that’s probably exactly what is going to happen over time, since that is almost always what regulation does. Some company (Amazon or Meta or Microsoft or Google) will backdoor legislation via campaign donations to you-know-who, to make sure large, regulated companies are the only ones who can run advanced AI models, in the name of “responsibility” and “safety”. And by seeding all these doom-and-gloom headlines pushing an “AI will take over the world” narrative, the public will be only too happy to give up the rights of other people for a thing “they weren’t going to do anyway”, like usual.
Yeah, I agree, although I hate the use of the name "AI". It's marketing nonsense driven by a gold rush, both to sell companies to shareholders and to get consumers interested.
The stuff we're seeing, like ChatGPT, is not AI; these are extremely powerful and impressive language-based models, and current systems have an impressive ability to "remix" content into new content. But the technology isn't ready for the mass market yet: it's too inaccurate, and yet it's being rushed into search engines out of fear of being left behind.
There are also huge issues around the data used to "train" these systems. "Train" is a misnomer: that data is taken, stored, and pulled upon actively and constantly. Who owns the data, and whose data has been used? Whose artworks, for example, are used as the engine driving the generative art produced by art AIs? When human users create and share content, it is being copied and stored, or constantly accessed, to drive the AI. User content across the internet is being used to drive these systems, and yet it's not the original content makers who benefit; it's the AI firms and the tech firms who are opaquely selling and hoarding our data and content for commercial gain.
People don't understand how much data has essentially been stolen from us to drive this technology. Look at the Reddit saga: one driver for locking down the API is to lock out AI companies, not to protect your data or content, but purely so Reddit can monetise it. We are the product when we use social media; it used to be that we were sold to advertisers, but now the content we share has been commandeered by social media and tech giants and is being used to drive "AI". And because it's complex and technical, we're being screwed over en masse to line the pockets of tech company executives and shareholders.