GPT-2 was the first model to generate long-form, coherent text.

The research led to a technology with a simple use case: input / output.

The "product" is a chatbot. input / output.

There is no innovation in interface or product design.

All the magic, all the "product innovation", is in its ability to produce coherent text.

(OpenAI's UI is not really that great.)

All attempts at "product" (as in serving a specific use case tailored to a specific business or commercial workflow) that OpenAI have led to date – i.e. plugins – have been a huge flop.

I'm not sure who is behind the Assistants interface design that makes RAG so easy. Its usability is underpinned by GPT-4 Turbo's larger context window.

All of the magic is in the underlying technology, and I think Ilya is at the core of it. Yes, "Attention Is All You Need" is the Google paper that introduced the transformer.

Ilya also worked at Google and was a key part of that ecosystem, and I imagine he made indirect (and most likely very direct) contributions.

He is one of the most cited computer scientists of all time.

Without Ilya, I don't think we'd have the hype right now.

Someone would have come to the same conclusion, I'm sure.

