Zero-shot prompting example using LangChain and Google Gemini:
# langchain-google-genai contains LangChain integrations for Google’s Gemini models.
# It provides classes like ChatGoogleGenerativeAI, which let you use Gemini through LangChain.
# langchain-core contains message objects (HumanMessage, AIMessage), prompt templates,
# output parsers, and the core interfaces for chains and models.
!pip install langchain-google-genai langchain-core
# ChatGoogleGenerativeAI is the LangChain wrapper for interacting with Google’s Gemini models
# HumanMessage represents user input in chat format
# os is used to access environment variables
# getpass allows entering the API key securely in the terminal (hidden input).
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_core.messages import HumanMessage
import os
from getpass import getpass
# Ask the user for their API key and store it in the GOOGLE_API_KEY environment variable,
# which ChatGoogleGenerativeAI reads automatically
os.environ["GOOGLE_API_KEY"] = getpass("Enter your API key: ")
# Create an instance of the Gemini chat model
model = ChatGoogleGenerativeAI(
    model="gemini-2.5-flash",  # Gemini model to use
    temperature=0.3            # low temperature keeps answers focused and less random
)
# Zero-shot prompt - the model receives only the instruction, with no examples of the desired output
prompt = "Give an easy-to-understand explanation of quantum computing."
# Send the message to the model and retrieve the reply
response = model.invoke([HumanMessage(content=prompt)])
print(response.content)
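The comments above note that langchain-core also ships prompt templates and output parsers. As a minimal sketch (not part of the original example), the same zero-shot call can be expressed as a small chain; the {topic} placeholder and the variable names here are illustrative:

# ChatPromptTemplate builds the chat messages from a template string
# StrOutputParser extracts the plain text from the model's AIMessage reply
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# The "human" role corresponds to HumanMessage; {topic} is filled in at invoke time
prompt_template = ChatPromptTemplate.from_messages(
    [("human", "Give an easy-to-understand explanation of {topic}.")]
)

# Pipe the pieces together: template -> Gemini model -> plain-string output
chain = prompt_template | model | StrOutputParser()
print(chain.invoke({"topic": "quantum computing"}))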