Multimodal!

Cannot wait to try this out.
"GPT-4 generally lacks knowledge of events that have occurred after the vast majority of its data cuts off (September 2021), and does not learn from its experience."

Surprising that it has the same knowledge cut-off as GPT-3 - was it trained on the same data?
"gpt-4 has a context length of 8,192 tokens. We are also providing limited access to our 32,768–context (about 50 pages of text) version, gpt-4-32k"

That's what I was most hoping for: it enables things like summarizing full academic papers in one go, rather than having to split them up.
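
A rough sketch of what that looks like, assuming the pre-1.0 openai Python client and that GPT-4 uses tiktoken's cl100k_base encoding ("paper.txt" and the 30,000-token headroom cutoff are my placeholders, not official numbers):

```python
# Sketch: summarize a full paper in one call if it fits in the
# 32,768-token window of gpt-4-32k. Assumes the openai (pre-1.0)
# and tiktoken libraries; cl100k_base is assumed to be GPT-4's encoding.
import openai
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")
paper = open("paper.txt").read()  # placeholder: extracted text of a paper
n_tokens = len(encoding.encode(paper))
print(f"{n_tokens:,} tokens")

# Leave headroom: the window covers the prompt plus the generated summary.
if n_tokens < 30_000:
    response = openai.ChatCompletion.create(
        model="gpt-4-32k",
        messages=[
            {"role": "user", "content": f"Summarize this paper:\n\n{paper}"}
        ],
    )
    print(response["choices"][0]["message"]["content"])
else:
    print("Still too long - would need to split it after all.")
```

Note the headroom check: the context length counts input and output tokens combined, so a paper that is exactly 32,768 tokens would leave no room for the summary itself.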
Bing was GPT-4 all along!

The "GPT-4 Technical Report" is a 98 page PDF which has a whole bunch of dense additional information that may not have been reported widely yet cdn.openai.com/papers/gpt-4.pdf