How to use AI to generate a newsletter
In this tutorial, we will use several Google Cloud Platform services to build auto-generated newsletters:
- Use Gemini API to generate a newsletter.
- Upload the newsletter to a GCS bucket.
- Use Cloud Build to build container images and store them in Artifact Registry.
- Create a Cloud Run service to show the content.
- Create a Cloud Run job to automatically generate a newsletter every week.
- Use Secret Manager to store the Gemini API key.
Set up the Gemini API Key
Go to aistudio.google.com to create an API key. You will need to select a project for the key.
Copy the API key and add it to your ~/.bashrc or ~/.zshrc:
export GEMINI_API_KEY=XXXXXXXXXXXXXXXX
Activate it:
. ~/.zshrc
Test the API key. Note that the header -H "X-goog-api-key: $GEMINI_API_KEY" must use double quotes so that $GEMINI_API_KEY is expanded by the shell.
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent" \
-H 'Content-Type: application/json' \
-H "X-goog-api-key: $GEMINI_API_KEY" \
-X POST \
-d '{
"contents": [
{
"parts": [
{
"text": "Explain how AI works in a few words"
}
]
}
]
}'
If the call succeeds, you should see a response like this:
{
"candidates": [
{
"content": {
"parts": [
{
"text": "AI learns patterns from data to make predictions or decisions.\n"
}
],
"role": "model"
},
"finishReason": "STOP",
"avgLogprobs": -0.039194737871487938
}
],
"usageMetadata": {
"promptTokenCount": 8,
"candidatesTokenCount": 12,
"totalTokenCount": 20,
"promptTokensDetails": [
{
"modality": "TEXT",
"tokenCount": 8
}
],
"candidatesTokensDetails": [
{
"modality": "TEXT",
"tokenCount": 12
}
]
},
"modelVersion": "gemini-2.0-flash",
"responseId": "XXXXXXXXXXXX"
}
If the key is not working, you will get an error instead:
{
"error": {
"code": 400,
"message": "API key not valid. Please pass a valid API key.",
"status": "INVALID_ARGUMENT",
"details": [
{
"@type": "type.googleapis.com/google.rpc.ErrorInfo",
"reason": "API_KEY_INVALID",
"domain": "googleapis.com",
"metadata": {
"service": "generativelanguage.googleapis.com"
}
},
{
"@type": "type.googleapis.com/google.rpc.LocalizedMessage",
"locale": "en-US",
"message": "API key not valid. Please pass a valid API key."
}
]
}
}
After a successful call, verify the usage in the Cloud Console: on the APIs & Services page (https://console.cloud.google.com/apis/dashboard?project=PROJECT_ID), the number of Generative Language API requests should increase.
Create a script to generate a newsletter
Install the Google GenAI SDK:
pip install -q -U google-genai
Create a new file run.py:
from google import genai
# The client gets the API key from the environment variable `GEMINI_API_KEY`.
client = genai.Client()
contents = "Generate a list of the airline route news in the past week, with source links"
response = client.models.generate_content(
model="gemini-2.5-flash", contents=contents
)
# Write the response to a file
with open("output.md", "w") as file:
file.write(response.text)
Run the script:
python3 run.py
Check the output, and you may find that Gemini did return a list of airline route news. However, it is a list of old news from months ago, not exactly "in the past week". This is because models have a knowledge cutoff date, which roughly corresponds to when the model was trained. To get the latest results, we need to use the grounding tool to search for current content.
Modify the script:
from google import genai
from google.genai import types
client = genai.Client()
contents = "Generate a list of the airline route news in the past week, with source links"
# Define the grounding tool
grounding_tool = types.Tool(
google_search=types.GoogleSearch()
)
# Configure generation settings
config = types.GenerateContentConfig(
tools=[grounding_tool]
)
# Generate content with grounding
response = client.models.generate_content(
model="gemini-2.5-flash", contents=contents, config=config
)
# Write the response to a file
with open("output.md", "w") as file:
file.write(response.text)
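Since the prompt asks for source links, it can also be useful to inspect the grounding metadata attached to the response. A minimal sketch, assuming the response fields documented for the Gemini grounding feature (verify them against your SDK version):
# Optional: print the web sources Gemini grounded the answer on.
metadata = response.candidates[0].grounding_metadata
if metadata and metadata.grounding_chunks:
    for chunk in metadata.grounding_chunks:
        print(chunk.web.title, chunk.web.uri)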
Upload the files to a GCS bucket
Prefer - over _ in bucket names. For DNS compliance and future compatibility, you should not use underscores in bucket names; hyphens are standard DNS characters.
BUCKET_NAME=airline-routes-newsletter
# Create a new bucket in us-central1.
gcloud storage buckets create gs://${BUCKET_NAME} --location=us-central1
Add this to the script:
import datetime
from google.cloud import storage
# ...
print("Uploading the newsletter to GCS ...")
date_str = datetime.datetime.now().strftime("%Y-%m-%d")
storage_client = storage.Client()
bucket = storage_client.bucket("airline-routes-newsletter")
blob = bucket.blob("airline-routes-newsletter-" + date_str)
blob.upload_from_filename("output.md")
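Note that storage.Client() authenticates via Application Default Credentials. When running the script locally, set those up first if you have not already:
gcloud auth application-default login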
If you are the only one consuming this content, you can stop now. But then again, if it is only for you, why spend all this effort generating a Markdown file instead of just asking Gemini when you need it?
So ... Let's productionize it.
Create a Cloud Run service to serve the content
Create an app.js to run a Node.js server.
const express = require('express');
const { Storage } = require('@google-cloud/storage');
const marked = require('marked');
// Use environment variables for flexibility. Cloud Run will automatically provide the PORT.
const PORT = process.env.PORT || 8080;
const GCS_BUCKET_NAME =
process.env.GCS_BUCKET_NAME || 'airline-routes-newsletter';
// File names are resolved dynamically per request.
const app = express();
const storage = new Storage();
// Basic HTML template with Tailwind CSS for a clean, responsive look.
const HTML_TEMPLATE = `
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>GCS Markdown Renderer</title>
<script src="https://cdn.tailwindcss.com"></script>
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;700&display=swap" rel="stylesheet">
<style>
body {
font-family: 'Inter', sans-serif;
background-color: #f8f8f8;
}
.prose {
max-width: 800px;
margin-left: auto;
margin-right: auto;
}
h1, h2, h3, h4, h5, h6 {
font-weight: bold;
}
</style>
</head>
<body class="bg-gray-100 p-8 flex items-center justify-center min-h-screen">
<div class="bg-white p-8 rounded-lg shadow-xl w-full max-w-4xl prose">
<!-- Markdown content will be inserted here -->
</div>
</body>
</html>
`;
// Middleware to set the Content-Type header
app.use((req, res, next) => {
res.setHeader('Content-Type', 'text/html');
next();
});
// Main route: list all files in the bucket as links.
app.get('/', async (req, res) => {
try {
const bucket = storage.bucket(GCS_BUCKET_NAME);
const [files] = await bucket.getFiles();
const fileLinks = files
.map((file) => {
return `<li><a href="/file/${encodeURIComponent(file.name)}" class="text-blue-600 hover:underline">${file.name}</a></li>`;
})
.join('\n');
const htmlContent = `
<h1 class="text-2xl font-bold mb-4">Files in Bucket: ${GCS_BUCKET_NAME}</h1>
<ul class="list-disc pl-6 space-y-2">
${fileLinks}
</ul>
`;
const finalHtml = HTML_TEMPLATE.replace(
'<!-- Markdown content will be inserted here -->',
htmlContent
);
res.status(200).send(finalHtml);
} catch (error) {
console.error('Error listing files:', error);
const errorMessage = `<h1>Error:</h1><p>Could not list files from GCS. Please check your bucket. Error details: ${error.message}</p>`;
const finalHtml = HTML_TEMPLATE.replace(
'<!-- Markdown content will be inserted here -->',
errorMessage
);
res.status(500).send(finalHtml);
}
});
// Detail route: render the selected markdown file
app.get('/file/:filename', async (req, res) => {
const filename = req.params.filename;
try {
const bucket = storage.bucket(GCS_BUCKET_NAME);
const file = bucket.file(filename);
const [markdownContent] = await file.download();
const htmlContent = marked.parse(markdownContent.toString('utf8'));
const finalHtml = HTML_TEMPLATE.replace(
'<!-- Markdown content will be inserted here -->',
htmlContent
);
res.status(200).send(finalHtml);
} catch (error) {
console.error('Error retrieving or rendering markdown:', error);
const errorMessage = `<h1>Error:</h1><p>Could not retrieve Markdown file from GCS. Please check your bucket and file path. Error details: ${error.message}</p>`;
const finalHtml = HTML_TEMPLATE.replace(
'<!-- Markdown content will be inserted here -->',
errorMessage
);
res.status(500).send(finalHtml);
}
});
// Start the server
app.listen(PORT, () => {
console.log(`Server is running on http://localhost:${PORT}`);
});
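The Dockerfile below installs dependencies from package.json and starts the app with npm start, so the service folder also needs a package.json. A minimal sketch (the version ranges are illustrative assumptions, not requirements of this tutorial):
{
  "name": "newsletter-service",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "@google-cloud/storage": "^7.0.0",
    "express": "^4.19.0",
    "marked": "^12.0.0"
  }
}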
Create a Dockerfile in the service folder:
# Use Node.js LTS image
FROM node:22
# Set working directory
WORKDIR /usr/src/app
# Copy package files and install dependencies
COPY package.json package-lock.json* ./
RUN npm install --production
# Copy the rest of the app
COPY . .
# Expose the port Cloud Run will use
ENV PORT=8080
EXPOSE 8080
# Start the app
CMD ["npm", "start"]
Build and push the Docker image:
Create a repo:
PROJECT_ID=YOUR_PROJECT_ID
SERVICE_NAME=airline-route-newsletters-service
gcloud artifacts repositories create container-images \
--repository-format=docker \
--location=us-central1
gcloud builds submit --tag us-central1-docker.pkg.dev/${PROJECT_ID}/container-images/${SERVICE_NAME} ./service
Deploy to Cloud Run:
gcloud run deploy ${SERVICE_NAME} \
--image us-central1-docker.pkg.dev/${PROJECT_ID}/container-images/${SERVICE_NAME} \
--platform managed \
--region us-central1 \
--allow-unauthenticated
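Once deployed, grab the service URL and open it in a browser:
gcloud run services describe ${SERVICE_NAME} --region us-central1 --format='value(status.url)'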
By default, the Cloud Run service runs as the Compute Engine default service account (PROJECT_NUMBER-compute@developer.gserviceaccount.com), which has the Editor role, so it should have permission to read from GCS buckets. However, relying on these default permissions is generally considered a security risk. It violates the principle of least privilege by granting your application more permissions than it needs, which could be exploited if your service is compromised.
It is recommended to create a dedicated, user-managed service account instead.
To create a service account:
SERVICE_ACCOUNT_NAME=airline-route-newsletters-sa
gcloud iam service-accounts create ${SERVICE_ACCOUNT_NAME} \
--display-name="Service Account for the Airline Route Newsletter app."
Update the service to use the new service account:
gcloud run services update ${SERVICE_NAME} \
--region=us-central1 \
--service-account="${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"
Create a Cloud Run job to auto-generate newsletters
Create a Dockerfile in the job folder:
FROM python:3-slim
ENV GEMINI_API_KEY=""
WORKDIR /app
COPY run.py .
# Install dependencies
RUN pip install google-cloud-storage google-genai
CMD ["python3", "run.py"]
Build the image and store it in Artifact Registry:
PROJECT_ID=YOUR_PROJECT_ID
JOB_NAME=airline-route-newsletters-job
gcloud builds submit --tag us-central1-docker.pkg.dev/${PROJECT_ID}/container-images/${JOB_NAME} ./job
Enable Secret Manager:
gcloud services enable secretmanager.googleapis.com
# Create a new secret for the API key (printf avoids storing a trailing newline):
printf "XXXXXXXXXXX" | gcloud secrets create GEMINI_API_KEY --data-file=-
Bind the service account with roles/secretmanager.secretAccessor so it can read the API key from Secret Manager:
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
--member="serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com" \
--role="roles/secretmanager.secretAccessor"
Create the job. Note that this does NOT execute it:
gcloud run jobs create ${JOB_NAME} \
  --region us-central1 \
  --image us-central1-docker.pkg.dev/${PROJECT_ID}/container-images/${JOB_NAME}:latest \
--service-account="${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com" \
--set-secrets=GEMINI_API_KEY=GEMINI_API_KEY:latest \
--max-retries=0
Grant permissions
# Grant permissions to invoke cloud run jobs
gcloud run jobs add-iam-policy-binding ${JOB_NAME} \
--member="serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com" \
--role="roles/run.invoker" \
--region=us-central1
# Grant permissions to delete and create GCS objects
BUCKET_NAME=airline-routes-newsletter
gcloud storage buckets add-iam-policy-binding gs://${BUCKET_NAME} \
--member="serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com" \
--role="roles/storage.objectAdmin"
Execute the job
# Describe the job
gcloud run jobs describe airline-route-newsletters-job --region us-central1
# Execute the job
gcloud run jobs execute airline-route-newsletters-job --region us-central1
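To check how an execution went:
# List executions and their status
gcloud run jobs executions list --job airline-route-newsletters-job --region us-central1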
Maintenance
To update the job, first rebuild and push the image:
gcloud builds submit --tag us-central1-docker.pkg.dev/${PROJECT_ID}/container-images/${JOB_NAME} ./job
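Note that Cloud Run resolves the image tag to a digest when the job is created or updated, so pushing a new :latest image is not enough on its own; point the job at the rebuilt image explicitly:
gcloud run jobs update ${JOB_NAME} \
  --image us-central1-docker.pkg.dev/${PROJECT_ID}/container-images/${JOB_NAME}:latest \
  --region us-central1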