From the course: Exploring Million-Token Models with Google Gemini Pro 1.5
The future of AI is huge - Gemini Tutorial
- Something a bit unexpected happened on the way to artificial general intelligence, and that's the explosion of huge context windows. In other words, the amount of information that large language models like ChatGPT and Google Gemini can remember when handling prompts. With Gemini 1.5, Google has taken context windows to a whole new level, letting users access up to a million tokens. But what does that mean in practical terms? These new tools let you work with documents of more than 700,000 words, ask questions about an entire developer codebase, and even answer questions about an hour's worth of video. So now that we've entered the age of massive context, what can that do for you? Should it affect your prompt engineering strategy? And is it worth the cost? Let's find out.