The problem with generative AI is that it will always do what it is asked. For example, ask it for a description of George Washington's first grandchild "Susan" and it will give you a description. But of course Washington had no children.

These large models are trained to predict the words that should follow a prompt. They have zero understanding of anything at all.

What if you, with superhuman effort, memorized the answers to a Chinese history exam written in Chinese (and you don't know the language)? In theory, if you were smart enough, you could score 100% and not even know what the questions were asking or what your answers meant. This is a little like how AI works today. What makes the AI "smart" is that it does not memorize answers but patterns for constructing answers. It stores a lot of patterns.
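
To make the "patterns, not understanding" point concrete, here is a toy sketch of what "predicting the words that should follow a prompt" looks like. It uses the Hugging Face transformers library and the small GPT-2 model purely as stand-ins; nobody is claiming this is what any particular product runs.

```python
# Toy next-token prediction: the model keeps picking a likely next token,
# whether or not the premise of the prompt is true.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "George Washington's first grandchild, Susan, was"
ids = tokenizer(prompt, return_tensors="pt").input_ids

for _ in range(20):
    with torch.no_grad():
        logits = model(ids).logits        # scores for every possible next token
    next_id = logits[0, -1].argmax()      # greedily take the most likely one
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
# The loop happily continues the sentence; nothing in it checks facts.
```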

This is Apple's problem. No one wants a search engine that will find results for Washington's kids, Beethoven's 11th symphony, or other made-up stuff. And no, they absolutely cannot think of all these cases in advance and check for them. There is literally an infinite number of "wrong facts".

Building a RELIABLE and ACCURATE large language model (LLM) has never been done. It may be impossible.

Google's search engine works very differently. It finds links; it does not claim to know anything at all.

I think the way to go is to have the LLM accept the user's query and then produce a good search phrase, send that to a standard search engine, then summarize the hits. This way there is a gap and the LLM is not making up the answers. This has not been done yet. Give them a year or two.
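
For what it's worth, here is a rough sketch of that pipeline. The web_search() function is a hypothetical stand-in for a real search API, and the OpenAI client and model name are just example choices, not a claim about how anyone actually builds this.

```python
# Sketch: the LLM rewrites the question into a search query, a real search
# engine supplies the hits, and the LLM only summarizes what came back.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def web_search(query: str) -> list[str]:
    """Hypothetical stand-in: return text snippets from the top search hits."""
    raise NotImplementedError("plug in a real search API here")


def answer(user_question: str) -> str:
    # Step 1: the LLM only writes a search phrase.
    query = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name; any chat model would do
        messages=[{"role": "user",
                   "content": f"Rewrite this as a short web search query: {user_question}"}],
    ).choices[0].message.content

    # Step 2: a conventional search engine finds the facts.
    snippets = web_search(query)

    # Step 3: the LLM summarizes only the returned snippets.
    prompt = ("Summarize these search results to answer the question. "
              "If they don't answer it, say so.\n\n"
              f"Question: {user_question}\n\n" + "\n".join(snippets))
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
```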
 
AI is the biggest threat to Apple's entire ecosystem.

GPT4 is demonstrating that an LLM can be your super personal assistant. And if LLMs can be your super personal assistant, your iPhone is suddenly no longer that important.

For example, need to plan and book a trip? Just tell your LLM to do it. It can even book tickets for you.

The flight is canceled and you're stranded? No problem. Ask your LLM to automatically call the flight company's customer service department to get a refund and check to see if their terms of service will provide a hotel for you. If so, have the LLM book a hotel.

None of this requires anything other than a screen, an internet connection, and your LLM personal assistant. You don't need to juggle different apps to do this. Just tell your LLM what you want. Therefore, it doesn't matter if you're using an iPhone or a $50 junk phone.
This seems to be a fundamental misunderstanding of Apple's business. So much so that I feel like you meant to write "Google" instead of "Apple" (and "Google search" or "Google search on a Pixel" instead of "iPhone").

Apple is a creator that makes great hardware and custom software to fit that hardware. Their income doesn't rely on people booking tickets for trips or whatever other stuff you wrote.

An LLM personal assistant needs to exist somewhere. An offline version by a company with a dedication to privacy and security is going to work out well. An awesome phone with great hardware and great software is still going to be fine.
 
It's going to ask you to confirm everything before booking.

Also, if you're going to ask it to book a trip to London, any decent AI would ask you to clarify whether it's London, UK, or London, Kentucky, or wherever other Londons exist. Heck, as a rational human being, you should know that most humans would just assume that you meant London, UK. Therefore, if you're really going to London, Kentucky, you'd most certainly say so explicitly.
Real life use case from just last night. I have ChatGPT enabled on my car and was heading to an airport to pick up family. I asked if it could look up a flight status and it requested the airline and flight number. It came back with "I found 2 matching flights and based on the time and proximity, the flight landed 10 minutes ago." Simple use case, but it was very conversational.

So yeah, I can see myself using it as a personal assistant in the future.
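
No idea what the car integration actually does under the hood, but as a purely illustrative sketch, a flow like that can be wired up with tool calling in the OpenAI chat API. The get_flight_status tool below is hypothetical; you would have to back it with a real flight-data service, and the model name is just an example.

```python
# Sketch of a tool-calling round trip: the model asks for the airline and
# flight number via a structured call, your code does the actual lookup,
# and the model turns the result into a conversational answer.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_flight_status",  # hypothetical; back it with a real flight API
        "description": "Look up the live status of a flight.",
        "parameters": {
            "type": "object",
            "properties": {
                "airline": {"type": "string"},
                "flight_number": {"type": "string"},
            },
            "required": ["airline", "flight_number"],
        },
    },
}]

messages = [{"role": "user", "content": "Has Delta 1432 landed yet?"}]
first = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
call = first.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)  # e.g. {"airline": "Delta", "flight_number": "1432"}

status = {"status": "landed", "minutes_ago": 10}  # pretend result; a real version would pass args to a flight-data API

messages += [first.choices[0].message,
             {"role": "tool", "tool_call_id": call.id, "content": json.dumps(status)}]
final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
print(final.choices[0].message.content)  # e.g. "That flight landed about 10 minutes ago."
```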
 
This is your chance, Apple. A solid conversational AI is still lacking. It's your chance to be the forerunner after a decade of Siri neglect. Waste no time, and spend big.
 
Real life use case from just last night. I have ChatGPT enabled on my car and was heading to an airport to pick up family. I asked if it could look up a flight status and it requested the airline and flight number. It came back with "I found 2 matching flights and based on the time and proximity, the flight landed 10 minutes ago." Simple use case, but it was very conversational.

So yeah, I can see myself using it as a personal assistant in the future.

Can I ask how you enabled ChatGPT in your car?
 
Sure, and hackers will have no problem hijacking the hardware and your life on a $50 phone. I would never, ever venture out of the Apple Walled Garden - and even better will be the future “Apple GPT Walled Garden.”
You're not the customer Apple has to worry about. If Apple cannot continue to grow by stealing Android users, they will suffer. Android users primarily shop by price, and a cheap phone with better AI than Siri will become a threat if Apple does nothing more than talk about AI without actually doing anything.
 
Apple's ecosystem is made up of their devices and services.

You'll still need a device to use A.I. and you're still going to buy an Apple Watch for its fitness and health tracking features.

How is A.I. going to replace an Apple service like Apple Pay, Card, Savings, iCloud, News, Music, TV+, etc.? Is A.I. going to make people consume less content (news, music, video, games)?


You're telling me people will be using their iPhone less? People won't be browsing Facebook/Instagram, X (Twitter), TikTok, etc.? People are going to be taking fewer pictures/videos and posting them online? People are going to give up being influencers and content creators? On the contrary, I think generative A.I. will see more people creating and editing content on their iPhones.
I think their point isn't that people will be using their phones less, but just differently. Remember back 10 years ago when everyone was willing to give Siri a chance? Some people still use Siri regularly, but the vast majority have been burned too many times and won't go back. But those people are willing to give a smarter, more intuitive, and more conversational Siri a chance, especially if they see their friends and family doing cool things just by talking to their phones. People will be telling their phones to pick the best photos without even looking at them first. People will be telling their phones to post a message about how great their day is going to their social media without having to write anything or choose an image. People will tell their phones to pick the top news stories of the day that are related to the book they are currently reading without having to peruse any news headlines. The possibilities are limited by Siri but are endless if the AI is good enough.
 
For example, ask it for a description of George Washington's first grandchild "Susan" and it will give you a description. But of course Washington had no children.
[Attached image: IMG_6294.jpeg]

And yet it’s you that speaks of ‘wrong facts’.
 
OpenAI’s Whisper API (voice-to-text) alone is 10x better than Google Assistant and 100x better than Siri, so if you can get that in a phone, it will be a major competitive advantage. If Apple doesn’t get in on it now, it will be in big trouble 5 years down the line.
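
For anyone curious, calling it is about as simple as an API gets. Here's a minimal sketch with the current openai Python package; the file name is just an example.

```python
# Minimal sketch: send an audio file to OpenAI's Whisper endpoint, print the text.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("voicemail.m4a", "rb") as audio_file:  # example file
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

print(transcript.text)
```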
 
There are so many applications for AI, but somehow people only focus on Google replacements, i.e., fact-based answers, which is not LLMs’ strength.

In the meantime, image generation, voice-to-text, text-to-voice, and many other applications of AI are completely changing the game, but people will still doubt it just because ChatGPT gets some silly factual question wrong.
 
This type of AI-based assistant, coupled with AR and in some respects VR, is totally the future. I’m ready to be Iron Man; it can’t come quickly enough.
 
AI is the biggest threat to Apple's entire ecosystem.

GPT4 is demonstrating that an LLM can be your super personal assistant. And if LLMs can be your super personal assistant, your iPhone is suddenly no longer that important.

For example, need to plan and book a trip? Just tell your LLM to do it. It can even book tickets for you.

The flight is canceled and you're stranded? No problem. Ask your LLM to automatically call the flight company's customer service department to get a refund and check to see if their terms of service will provide a hotel for you. If so, have the LLM book a hotel.

None of this requires anything other than a screen, an internet connection, and your LLM personal assistant. You don't need to juggle different apps to do this. Just tell your LLM what you want. Therefore, it doesn't matter if you're using an iPhone or a $50 junk phone.
You think that AI is the biggest threat to Apple’s ecosystem because users won’t need an iPhone to book a flight or hotel? That hardware will all of a sudden not matter anymore? You must be from the future! Or another planet…
 
Apple’s been mostly introducing AI/ML quietly, as product improvements (for example, searching for objects, people, animals, text, etc. through my photos has definitely been improving dramatically over time), so outside of a major replacement of Siri or a major new API (both of which are eventually likely; the latter would definitely make a more compelling case for the high-end Studio and Mac Pro if it leveraged the GPU’s ability to access all the RAM directly), I wouldn’t expect big AI announcements.
 
It's going to ask you to confirm everything before booking.

Also, if you're going to ask it to book a trip to London, any decent AI would ask you to clarify whether it's London, UK, or London, Kentucky, or wherever other Londons exist. Heck, as a rational human being, you should know that most humans would just assume that you meant London, UK. Therefore, if you're really going to London, Kentucky, you'd most certainly say so explicitly.
You're not wrong.

This happens all the time - and not just with Londons. There was a Karen TikTok vid just yesterday of someone going postal because their flight was to London, England - the girl didn't even look at her tickets and showed up to board the flight. The only thing that saved her was when they asked for her passport.
 
If just an ounce of GPT's ability to hold a conversation and accurately transcribe words makes it into Siri, then I'm happy.

The number of sentences that make current Siri go look something up in Safari, instead of speaking an answer or doing the thing I asked, is staggering and far beyond acceptable.

The only way to use Siri today is by learning which commands it accepts and how to pronounce and structure the sentence so that it fits and triggers the response you want.

Not much of a virtual assistant when you have to assist it in assisting you.
Siri is trash.

AI by Apple will be a dumpster fire.

But... it won't ever turn into Skynet because it won't be able to connect to anything useful. So there is that.
 
The problem with generative AI is that it will always do what it is asked. For example, ask it for a description of George Washington's first grandchild "Susan" and it will give you a description. But of course Washington had no children.

These large models are trained to predict the words that should follow a prompt. They have zero understanding of anything at all.

What if you, with superhuman effort, memorized the answers to a Chinese history exam written in Chinese (and you don't know the language)? In theory, if you were smart enough, you could score 100% and not even know what the questions were asking or what your answers meant. This is a little like how AI works today. What makes the AI "smart" is that it does not memorize answers but patterns for constructing answers. It stores a lot of patterns.

This is Apple's problem. No one wants a search engine that will find results for Washington's kids, Beethoven's 11th symphony, or other made-up stuff. And no, they absolutely cannot think of all these cases in advance and check for them. There is literally an infinite number of "wrong facts".

Building a RELIABLE and ACCURATE large language model (LLM) has never been done. It may be impossible.

Google's search engine works very differently. It finds links; it does not claim to know anything at all.

I think the way to go is to have the LLM accept the user's query and then produce a good search phrase, send that to a standard search engine, then summarize the hits. This way there is a gap and the LLM is not making up the answers. This has not been done yet. Give them a year or two.
That's by design, because of AI phobia and worries that these things will become self-aware and turn into Skynet.

There was that Google engineer who claimed their AI became sentient - he was shut down quickly by Google after he leaked it, and Google denied it - but I am skeptical that he was FOS. Since then, Google's AI has been hamstrung, and now the government pressure is on too - because everyone has this fear of the Terminator coming true or a Cylon invasion.

While I don't believe Cylons are coming, it does need to be approached carefully.

 
Share it. Show us. Based on your last message, you were looking at stock tickers? You realize that GPT4 is trained on old data and does not have live data, right?

I've been amazed by GPT4 more than I've been let down. Clearly, it's not perfect, but people here seem to think GPT4 isn't a huge deal.
Ask and ye shall receive.

I have a lot of fun with ChatGPT, but it can, and does, confidently lie. So you need enough knowledge in an area to know when it's wrong. As you can see from my chat, it had the correct information in its training set.

[Attached image: IMG_4810.jpeg]
 
Apple will be the only tech corporation without an AI if it doesn't make one. Siri will look like she has brain damage compared to the AIs out there; she already does.
 
GPT4 is demonstrating that an LLM can be your super personal assistant. And if LLMs can be your super personal assistant, your iPhone is suddenly no longer that important.

For example, need to plan and book a trip? Just tell your LLM to do it. It can even book tickets for you.

The flight is canceled and you're stranded? No problem. Ask your LLM to automatically call the flight company's customer service department to get a refund and check to see if their terms of service will provide a hotel for you. If so, have the LLM book a hotel.
Sorry to be harsh, but you don't understand how LLMs work.

They don't understand what you're saying. They calculate a statistically likely response to your input.

This means they aren't going to be able to _do_ anything like your examples above, because that would require them to understand your input.

This isn't possible with current large language models because they don't understand anything. Adding that could require entirely new engineering.
 
Real life use case from just last night. I have ChatGPT enabled on my car and was heading to an airport to pick up family. I asked if it could look up a flight status and it requested the airline and flight number. It came back with "I found 2 matching flights and based on the time and proximity, the flight landed 10 minutes ago." Simple use case, but it was very conversational.

So yeah, I can see myself using it as a personal assistant in the future.
What are you talking about? GPT doesn’t have access to the internet. It can’t look up your flights for you.
 