What is Google Gemini CLI & how to install and use it?

Jul 29, 2025 · 2 min read
Written by Sharmila Ananthasayanam

Ever wish your terminal could help you debug, write code, or even run DevOps tasks, without switching tabs? Google’s new Gemini CLI might just do that.

Launched in June 2025, Gemini CLI is an open-source command-line AI tool designed to act like your AI teammate, helping you write, debug, and understand code right from the command line.

What is Gemini CLI?

Gemini CLI is a smart AI assistant you can use directly in your terminal. It’s not just for chatting; it’s purpose-built for developers.

Whether you're reviewing code, fixing bugs, managing Git workflows, or generating docs, Gemini CLI helps you get it done faster. It uses a ReAct (Reason + Act) loop to make decisions. This means it doesn’t just give answers: it reasons, acts, and interacts with both your local system and remote Model Context Protocol (MCP) servers to perform tasks intelligently.

How to Get Started with Gemini CLI?

Before you begin, make sure you have Node.js v18 or higher installed on your system.
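If you’re not sure whether your installed version qualifies, the major-version check can be sketched in the shell. This is a minimal sketch; the `node_ok` helper below is a hypothetical name for illustration, not part of Gemini CLI:

```shell
# Hypothetical helper: does a `node -v` style string meet the v18 minimum?
node_ok() {
  major="${1#v}"          # strip the leading "v": "v20.11.1" -> "20.11.1"
  major="${major%%.*}"    # keep only the major number: "20"
  [ "$major" -ge 18 ]
}

node_ok "v20.11.1" && echo "v20.11.1: supported"
node_ok "v16.20.0" || echo "v16.20.0: too old (need v18+)"
```

To check your own installation, run node_ok "$(node -v)".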

Option 1: Use without installing

You can try it instantly using:

npx https://github.com/google-gemini/gemini-cli

No setup required.


Option 2: Install it globally

If you plan to use it often, install it globally:

sudo npm install -g @google/gemini-cli

After installation, just type gemini in your terminal to launch it. 


Select your desired theme and press Enter.

Authentication Options for Using Gemini CLI

Once Gemini CLI starts, it’ll ask you to authenticate. You’ve got three options:

  1. Login with Google
    • 60 requests per minute
    • 1,000 requests per day
  2. Use a Gemini API Key
    • Either create a .env file in your project folder and add the line GEMINI_API_KEY=<your_gemini_api_key>, or export it in your terminal directly:

> export GEMINI_API_KEY=<your_gemini_api_key>

    • Free Tier:
      • Flash model only
      • 10 requests/min
      • 250 requests/day
    • Paid Tier:
      • Quota depends on your tier
  3. Use Vertex AI
    • Available for paid accounts
    • Quotas depend on your tier
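For the .env route, the file only needs a single line. A minimal sketch, run from your project root (the placeholder stays as shown; swap in your real key):

```shell
# Write the key into a project-local .env file (placeholder shown, not a real key).
printf 'GEMINI_API_KEY=%s\n' '<your_gemini_api_key>' > .env

# Confirm the entry landed in the file.
grep 'GEMINI_API_KEY' .env
```

Gemini CLI picks the key up from this file the next time you launch it in that folder.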

What Can Gemini CLI Do?

Gemini CLI is more than just an AI chatbot. Here are some of its coolest features:

  • Understand Big Codebases Ask it questions about your entire codebase, even beyond a 1 million token context window.
  • Generate Apps from PDFs or Sketches Leverage Gemini’s multimodal power to create code from designs or documents.
  • Automate DevOps Tasks Handle pull requests, resolve complex rebases, or run scripts, all with a single prompt.
  • Extend with Tools and MCP Servers Connect tools like Imagen, Veo, or Lyria to add media generation.
  • Ground Queries with Google Search Tap into the world’s knowledge using the built-in Google Search tool.
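As a rough illustration of the MCP extension point: Gemini CLI reads MCP server definitions from a settings.json file (commonly under ~/.gemini/). The server name and package below are hypothetical placeholders, not real tools:

```json
{
  "mcpServers": {
    "mediaTools": {
      "command": "npx",
      "args": ["-y", "@example/media-mcp-server"]
    }
  }
}
```

After restarting Gemini CLI, the tools exposed by a configured server become available to the model alongside the built-ins.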


Helpful Tips

  • Press / anytime to open the command helper.
  • Type /docs and hit Enter to open the official Gemini CLI documentation in your browser.

Final Thoughts

Gemini CLI is a smart and flexible tool that helps with coding tasks. It works equally well for small personal projects and large team codebases, making it easier to write, fix, and manage code right from your terminal.

What makes it stand out?

  • Open-source transparency
  • A generous free tier
  • Advanced AI from Google Gemini
  • Deep integration with developer tools
  • Multimodal support (text, code, images, and more)

And most importantly, it runs in the terminal, your favourite place to work.

Sharmila Ananthasayanam

I'm an AIML Engineer passionate about creating AI-driven solutions for complex problems. I focus on deep learning, model optimization, and Agentic Systems to build real-world applications.

