Kyle Wiggers / TechCrunch:
Anthropic expands Claude’s context window from 9K to 100K tokens, or ~75K words it can digest and analyze; OpenAI’s GPT-4 has a context window of ~32K tokens — Historically and even today, poor memory has been an impediment to the usefulness of text-generating AI.
Read more on Techmeme
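For a rough sense of what these window sizes mean in practice, the sketch below estimates whether a document fits in a given context window, using the ~0.75 words-per-token ratio implied by the blurb (100K tokens ≈ 75K words). The heuristic and helper names are illustrative assumptions, not an official tokenizer or API.

```python
# Rough sketch (assumed heuristic, not an official tokenizer):
# estimate token count from word count using the ~0.75 words-per-token
# ratio implied by "100K tokens ~= 75K words" above.

def estimated_tokens(text: str, words_per_token: float = 0.75) -> int:
    """Approximate token count from a whitespace-split word count."""
    word_count = len(text.split())
    return int(word_count / words_per_token)

def fits_in_context(text: str, context_window_tokens: int) -> bool:
    """Check whether the estimated token count stays within the window."""
    return estimated_tokens(text) <= context_window_tokens

# Example: a 75,000-word document against the window sizes mentioned above.
document = "word " * 75_000
print(estimated_tokens(document))          # ~100,000 tokens
print(fits_in_context(document, 100_000))  # True  (Claude's expanded window)
print(fits_in_context(document, 32_000))   # False (GPT-4's ~32K window)
```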