May 10, 2024

Gemini 1.5 with a 1 million token context window

Selecting model Gemini 1.5 Pro in drop down

Gemini 1.5 is now live in Go as a preview model. You can now work with larger and more text-dense files.

Gemini 1.5 has a 1 million token context window, nearly 8x greater than GPT-4 Turbo (128k) and 5x more than the Claude 3 family (200k).

For reference, 128k tokens equates to roughly 300 pages of text, so Gemini 1.5 can ingest entire academic textbooks, whereas other models are limited to much smaller files.
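As a rough back-of-the-envelope sketch of the comparison above, assuming the cited figure of about 300 pages per 128k tokens (the per-page token rate and the helper below are illustrative, not an official conversion):

```python
# Assumption: ~300 pages per 128k tokens, i.e. ~427 tokens per page.
TOKENS_PER_PAGE = 128_000 / 300

def approx_pages(context_tokens: int) -> int:
    """Rough page-count estimate for a given context window size."""
    return round(context_tokens / TOKENS_PER_PAGE)

# Context windows mentioned above.
for name, tokens in [("GPT-4 Turbo", 128_000),
                     ("Claude 3", 200_000),
                     ("Gemini 1.5", 1_000_000)]:
    print(f"{name}: ~{approx_pages(tokens)} pages")
```

Under that assumption, the 1 million token window works out to over 2,300 pages, comfortably more than a typical academic textbook.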

This is still a preview model, and note that large files may take a long time to process. If you run into any issues or have questions, reach out to us.

Bill Leaver

Product manager