My guess is that they host a version of the model locally on the iPhone.

Even if they don’t (or if it’s partially networked as some recent rumors suggest), it’ll be rolled into one or both of two predictable costs (to the consumer):

1. The device sale itself, either raising the average selling price (ASP) or eating into margin that Apple offsets with cost savings elsewhere

2. Recurring payments for iCloud (or any rebranding it might undergo along with the feature)

Apple’s pricing model, if not totally predictable, is exceedingly formulaic. If they deviate from these into some sort of nickel-and-diming on “AI” features alone, that would almost certainly be a clear sign that they’re betting against it as a long-term selling point.

This indeed seems to have been a heavy focus of their research team in the past year, e.g. "Efficient Large Language Model Inference with Limited Memory" [1] and OpenELM [2].

[1] https://arxiv.org/pdf/2312.11514

[2] https://arxiv.org/pdf/2404.14619 (with 1.1B parameters, this appears to be their attempt at building a lightweight LLM)

Maybe a very cut-down version; any of the more recent and capable OpenAI models are surely far too large to fit on an iPhone, and far too large to run there (in terms of both available memory and processing power).
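A rough sense of why: just holding the weights of a GPT-3-scale model dwarfs an iPhone's RAM, while a ~1.1B-parameter model like OpenELM fits comfortably, especially quantized. A back-of-envelope sketch (the 175B figure is GPT-3's public parameter count, used here as a stand-in for "a large OpenAI model"; actual runtime needs more for KV cache and activations):

```python
# Lower-bound memory to hold model weights alone,
# ignoring KV cache, activations, and runtime overhead.
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    return num_params * bytes_per_param / 1e9

openelm_fp16 = weight_memory_gb(1.1e9, 2.0)   # ~2.2 GB at 16-bit
openelm_int4 = weight_memory_gb(1.1e9, 0.5)   # ~0.55 GB at 4-bit
gpt3_fp16    = weight_memory_gb(175e9, 2.0)   # ~350 GB at 16-bit

# An iPhone 15 Pro has 8 GB of RAM, shared with the OS and apps.
print(openelm_fp16, openelm_int4, gpt3_fp16)
```

Even aggressive 4-bit quantization of a 175B model would still need ~87 GB for weights alone, which is why only a small model is plausible on-device.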

That would align with the 'limited abilities are free' approach.
