The Complete Beginner's Guide to Integrating Vercel with AWS

architecture diagram

When I first set out on this journey of finding the best way to integrate Vercel with AWS, I didn't know what to expect. Turns out, depending on your use case, there are several ways to go about it. Each with their own tradeoffs.

In this post, I'll show you how I recommend developers connect Vercel to AWS when they're new to the cloud. We'll use NextJS as our frontend framework and leverage API routes to securely connect to our AWS services. The application will be hosted on Vercel, and we'll use Clerk for authentication. Later in this post, I'll explain why I use Clerk instead of what AWS has to offer, and I'll show how this setup works for both local development and production. Towards the end, I'll go over when and why you may want to use a few different approaches.

Why Couple AWS with Vercel?

AWS has over 250 services. However, when I teach AWS to frontend developers, I assure them they only need to know 5-8 at first. In addition, I focus on AWS services that can be treated like APIs, are scalable, and have on-demand pricing. In short, I focus on serverless services.

However, while it's possible to build fullstack applications using AWS services alone, there's something to be said about the Developer Experience (DX) of the platform. It has gotten better over the years, but few would say it's comparable to what you'll find with Vercel.

By contrast, Vercel is a great platform for building fullstack applications. Their hosting platform makes it fast to get started but doesn't limit you when you want to scale for production. Unfortunately, in my opinion, it nudges you towards over-reliance on 3rd-party SaaS providers.

This series of posts is for those who want an à la carte approach: best-in-class DX, paired with highly scalable services at low cost and under one umbrella.

Architecture Overview

architecture diagram

There are a few different ways to approach tying together Vercel with AWS. I call this one the NextJS Proxy Pattern β„’. Here, we use NextJS API routes to proxy requests to our AWS services. This avoids having to set up an AWS-hosted API (API Gateway + Lambda or AppSync) and instead relies on the built-in API routes of NextJS.

It works like this:

  1. A user visits our Vercel hosted application.
  2. This takes them to our NextJS frontend.
  3. From here, the user will likely sign in with Clerk to visit protected pages.
  4. Once ready to make an API request, our backend API (NextJS API routes) will first fetch our AWS credentials.
  5. Once obtained, still in that API route, we will use the AWS SDK to make the request to our AWS service(s).

πŸ—’οΈ It's important to note that the AWS credentials themselves don't allow us to access any AWS services. The credentials are simply a set of keys that allow us to ask AWS if we have permission to perform a specific action. More on this in a bit.

Application Overview

To demonstrate how this will all come together, we'll bring a wordsearch game to life. The UI is already provided and can be found on GitHub here. On the main branch, the UI is filled with mock data so you can play around with it and get familiar. On the with-aws branch, you'll find the code with AWS services integrated. This is the branch we'll be working with for the rest of this post.

To install:

  1. Clone the repository: git clone https://github.com/focus-otter/wordsearch-game-ui.git
  2. Install the dependencies: npm install
  3. Start the development server: npm run dev
  4. Visit http://localhost:3000 to see the game.
  5. Switch to the with-aws branch: git checkout with-aws

Once the app is up and running, you should see the following:

wordsearch game ui

Secure Authentication with Clerk

Now that we have the UI up and running, let's secure our pages before we add any AWS services. Note that we could use Amazon Cognito for authentication, but as I've mentioned elsewhere online, Clerk provides a better DX, a richer feature set, and is easier to set up.

To get started, head over to Clerk and sign in or sign up for an account. Once you have an account, you can create a new application. For this tutorial, I'm sticking with Email and Google, but feel free to add other social providers.

clerk Better Wordsearch

Next, we'll need to configure Clerk in our app. The steps within Clerk provide two options for doing this:

  1. If you're using an agent-based IDE like Cursor, VS Code, or Windsurf, simply click the "Copy Prompt" button and paste it into the agent window.
  2. Otherwise, follow the steps on the Clerk page manually.

The end result is the same in either case: you'll receive a test publishable key and a secret key. Add those keys to your .env.local file:

NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY
CLERK_SECRET_KEY

Using Clerk's hosted pages, I've already configured the application for you. Feel free to look over the middleware.ts, layout.tsx, and navigation.tsx files to see what was implemented.

At this point, you should be able to visit the app, sign in, and play around with the mock wordsearches. Let's keep this going by creating our AWS services.

Creating our AWS services

This application isn't very complex from a backend perspective. We'll use DynamoDB for our database and Amazon Bedrock for our AI service.

Amazon DynamoDB is a NoSQL database that provides fast and predictable performance with seamless scalability. It's a great choice for storing structured data like our wordsearches. I personally love working with DynamoDB because it can start off simple by just uploading JSON data to it (like we'll do in this tutorial), but also scale and handle advanced data-modeling needs as well.

Amazon Bedrock is a service that allows you to use large language models (LLMs) to generate text, images, and other content. It's a great choice for our AI service. The benefit of using Bedrock is that we can easily switch between different models from different providers by simply swapping out the model ID. This keeps development simple, while also keeping the costs under one roof.

This tutorial assumes you already have an AWS account. If you don't, I have a full playlist on YouTube that covers everything you need to know to get started. Click here to get started.

Creating a DynamoDB table

From the AWS console, type "DynamoDB" into the search bar and select "DynamoDB" from the list of services.

aws dynamodb

Click on "Create table" and enter the following details:

Table name: 'WordSearch_DEV'
Partition key: 'PK' (String)
Sort key: 'SK' (String)

aws dynamodb create table

In the Table Settings, keep the defaults and click "Create table".

Instead of naming our partition key something like "wordsearchID", naming it "PK" allows us to dynamically model our data. This will make sense as we build out our application.
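To make that concrete, here's a hypothetical key scheme (the `USER#`/`WORDSEARCH#` prefixes are my illustration, not code from the repo) showing how generic "PK"/"SK" names let one table hold multiple item types:

```typescript
// With generic PK/SK names, a user profile and that user's puzzles can
// share a partition, so a single Query on PK returns all of them.
type WordsearchKey = { PK: string; SK: string }

// A user profile item: the SK marks it as the profile record.
function userKey(userId: string): WordsearchKey {
  return { PK: `USER#${userId}`, SK: "PROFILE" }
}

// A wordsearch owned by that user: same PK, different SK.
function wordsearchKey(userId: string, wordsearchId: string): WordsearchKey {
  return { PK: `USER#${userId}`, SK: `WORDSEARCH#${wordsearchId}` }
}
```

If we'd named the partition key "wordsearchID", only wordsearch items could ever live in the table.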

That's it! You've now created a DynamoDB table πŸŽ‰

Every AWS resource has a unique ARN (Amazon Resource Name). To get the ARN of our DynamoDB table, click on the table and then click on the "Settings" tab. In the "General Information" section, you'll see the ARN. Copy that as we'll need it later.
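DynamoDB table ARNs follow a predictable format, so you can also construct one from your region, account ID, and table name; a minimal sketch:

```typescript
// Build a DynamoDB table ARN from its parts, e.g.
// arn:aws:dynamodb:us-east-1:123456789012:table/WordSearch_DEV
function tableArn(region: string, accountId: string, tableName: string): string {
  return `arn:aws:dynamodb:${region}:${accountId}:table/${tableName}`
}
```

We'll use exactly this format when writing our IAM policy below.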

Creating Amazon Bedrock Models

In our app, we want users to be able to click a button and have an LLM generate 10 words for them. Recall that Bedrock is an umbrella service that lets us use different models from different providers. Assuming this is your first time using Bedrock, we'll first need to enable the models we want to use.

Type "Bedrock" into the search bar and select "Bedrock" from the list of services. Scroll down to the "Configure and learn" section and select "Model access".

aws bedrock model access

From here, select "Modify model access" and click "Enable model access".

πŸ—’οΈ Per the screenshot above, after Oct 8th, 2025, this step will not be necessary.

From there, the easiest thing to do is just select all the checkboxes and click "Enable model access" and "Next".

That's it! You've now enabled a bunch of models in your AWS account. These are referenced in code by their model ID. To see a list of all models, what they're for (text, video, etc.), and whether they support streaming, click here.

For this tutorial, we'll use the Amazon Nova Lite amazon.nova-lite-v1:0 model.

Creating an IAM user for local development

We have our AWS services ready to use, but we need to specify who or what can access them. In AWS speak, there are 3 concepts you should know and understand how they work together:

  1. IAM Policy - Simply put, a policy is a JSON document that defines a set of permissions (like reading/writing to our DynamoDB table).
  2. IAM Role - An identity that policies can be attached to and that users or services can temporarily assume. AWS has a bunch of built-in roles, but we can also create our own.
  3. IAM User - A person or application that needs to access AWS services. When you create a user, you can attach policies to them directly or let them assume roles.

In this tutorial, we'll create an IAM user that will be used to access our AWS services. This is great for local development, but when we deploy to production, we'll use a different approach.

Working a bit backwards, let's create our IAM policy first and then attach it to our IAM user. In the AWS console, use the search bar to search for IAM.

aws iam

On the sidebar and under "Access management", click on "Policies" and then click on the orange "Create policy" button.

From there, you'll have two options: create the policy using the Visual editor or the JSON editor. While the Visual editor is great for discovering what permissions are available, we'll use the JSON editor for this tutorial. Paste in the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "BedrockModelAccess",
            "Effect": "Allow",
            "Action": "bedrock:InvokeModel",
            "Resource": "*"
        },
        {
            "Sid": "BedrockModelAccessWithResponseStream",
            "Effect": "Allow",
            "Action": "bedrock:InvokeModelWithResponseStream",
            "Resource": "*"
        },
        {
            "Sid": "DynamoDBAccess",
            "Effect": "Allow",
            "Action": [
                "dynamodb:PutItem",
                "dynamodb:DeleteItem",
                "dynamodb:GetItem",
                "dynamodb:Query",
                "dynamodb:UpdateItem"
            ],
            "Resource": [
                "arn:aws:dynamodb:YOUR_AWS_REGION:YOUR_AWS_ACCOUNT_ID:table/WordSearch_DEV"
            ]
        }
    ]
}

πŸ—’οΈ Replace YOUR_AWS_REGION and YOUR_AWS_ACCOUNT_ID with your actual AWS region and account ID. You can find your AWS account ID on the top-right corner of the AWS console.

The first and second statements allow us to call any Bedrock model, whether for a simple request/response or for streaming. While we could have specified a model ID in the Resource field, I prefer the wildcard (*) since it makes it easier to switch models later.

The third statement allows us to read and write to our DynamoDB table.
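If you prefer generating this document in code (handy so the region/account placeholders can't be forgotten), here's a sketch. The statement shapes mirror the JSON above; the function name is just for illustration:

```typescript
// Build the dev IAM policy with region and account ID filled in.
// The statements match the JSON policy shown in the post.
function devPolicy(region: string, accountId: string) {
  return {
    Version: "2012-10-17",
    Statement: [
      { Sid: "BedrockModelAccess", Effect: "Allow", Action: "bedrock:InvokeModel", Resource: "*" },
      { Sid: "BedrockModelAccessWithResponseStream", Effect: "Allow", Action: "bedrock:InvokeModelWithResponseStream", Resource: "*" },
      {
        Sid: "DynamoDBAccess",
        Effect: "Allow",
        Action: ["dynamodb:PutItem", "dynamodb:DeleteItem", "dynamodb:GetItem", "dynamodb:Query", "dynamodb:UpdateItem"],
        // Scoped to the single dev table rather than a wildcard.
        Resource: [`arn:aws:dynamodb:${region}:${accountId}:table/WordSearch_DEV`],
      },
    ],
  }
}
```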

Click on "Next" and name the policy "WordSearch_DEV_IAM_POLICY" and click "Create policy".

Now we can create an IAM user. Back in IAM, click on "Users" in the sidebar, click the orange "Create user" button, name the user "WordSearch_DEV_IAM_USER", and click "Next".

aws iam user

Click on the "Attach existing policies directly" radio button and select the policy we just created. Click "Next" and click "Create user".

aws iam user

That's it! You've now created an IAM user that can access our AWS services.

To make use of this user in our app, we'll create an access key and secret key. Click on the user and on the "Summary" tab, click "Create access key".

aws iam user

AWS will ask for the use case -- select "Local code" and check the confirmation box before selecting "Next".

aws iam user

Finally, click "Create access key". The next screen will show you the access key and secret key. In our .env.local file, we'll add the following, along with the actual values:

AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
WORDSEARCH_TABLE=WordSearch_DEV
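It's worth failing fast if any of these are missing. A minimal sketch of loading and validating them (the variable names match the .env.local entries above; the helper names are mine):

```typescript
// Read a required environment variable or throw with a clear message.
function requireEnv(name: string): string {
  const value = process.env[name]
  if (!value) throw new Error(`Missing required environment variable: ${name}`)
  return value
}

// Gather everything the AWS SDK calls will need, in one place.
function loadAwsConfig() {
  return {
    accessKeyId: requireEnv("AWS_ACCESS_KEY_ID"),
    secretAccessKey: requireEnv("AWS_SECRET_ACCESS_KEY"),
    region: requireEnv("AWS_REGION"),
    tableName: requireEnv("WORDSEARCH_TABLE"),
  }
}
```

Calling this once at the top of an API route turns a confusing AWS auth error into an obvious "missing env var" error.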

Testing our application

We now have everything in place for our NextJS application to access our AWS services. The simplest way to verify this is to look at a simple API route that makes a request to an AWS service:

// api/aws/bedrock/route.ts

// Imports for this route -- exact package names may differ slightly in your setup
import { NextRequest, NextResponse } from "next/server"
import { auth } from "@clerk/nextjs/server"
import { generateObject } from "ai"
import { z } from "zod"
import { bedrock as bedrockClient } from "@ai-sdk/amazon-bedrock"

export async function POST(req: NextRequest) {
  const { isAuthenticated } = await auth() // from @clerk/nextjs/server
  if (!isAuthenticated) return new NextResponse("Unauthorized", { status: 401 })

  const body = await req.json()
  const { title } = body

  try {
    const result = await generateObject({ // from ai sdk
      model: bedrockClient("amazon.nova-lite-v1:0"),
      schema: z.object({ // from zod
        words: z.array(z.string()).length(10),
      }),
      prompt: `Generate exactly 10 words related to the theme "${title}". The words should be appropriate for a word search puzzle. Return only single words (no phrases), and make them varied in length between 4-10 letters. Focus on concrete nouns that relate to the theme.`,
    })

    return NextResponse.json(result.object)
  } catch (error) {
    console.error("Bedrock API error:", error)
    return new NextResponse("Failed to generate words", { status: 500 })
  }
}

If everything is wired up correctly, you should see the generated words appear on your webpage 🀯

While still on the with-aws branch, you'll also find api/aws/dynamodb/route.ts, an additional API route that makes a Clerk-authenticated request to our DynamoDB table.

Setting up Production with Vercel as an OIDC provider in AWS

This works great locally. And to be honest, we could recreate our data table for production, create a new IAM user, add its keys to Vercel as environment variables, and everything would work fine. But that's not the right way. Long-lived keys on a hosting provider are a liability, and honestly, even click-creating resources in the AWS Console gives me the ick. We'll address that in a future post.

Instead, the proper way is to let Vercel request temporary credentials from AWS and use those to access our AWS services. This is done using OIDC (OpenID Connect).

To set this up, we'll create a new IAM role that our Vercel project can assume. This first requires our app to be deployed to Vercel, so go ahead and do that now if you're following along. While in the Vercel dashboard, be sure to make note of your team name and project name. We'll need those in the AWS Console. Also note that if your AWS account doesn't already have Vercel registered as an OIDC identity provider (oidc.vercel.com/[YOUR_TEAM_NAME]), you'll need to add one under IAM > Identity providers first.

How this works is that Vercel exchanges a signed OIDC token for short-lived AWS credentials at runtime. This ensures that in the event of a data breach, there are no long-lived credentials to compromise.

To get started, back in the AWS Console, create a new DynamoDB table. Name it "WordSearch_PROD" and as before, use "PK" as the partition key and "SK" as the sort key. The rest of the settings can be kept as defaults.

We already have Amazon Bedrock enabled, so we can move on to creating our IAM role. In IAM, click on "Roles" in the sidebar and then click the orange "Create role" button. For the trusted entity type, select "Custom trust policy".

aws iam role

Because Vercel is an external party, we'll define what's known as a "trust policy" that allows Vercel to assume this role. Paste in the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": "arn:aws:iam::[YOUR_AWS_ACCOUNT_ID]:oidc-provider/oidc.vercel.com/[YOUR_TEAM_NAME]"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    "oidc.vercel.com/[YOUR_TEAM_NAME]:aud": "https://vercel.com/[YOUR_TEAM_NAME]"
                },
                "StringLike": {
                    "oidc.vercel.com/[YOUR_TEAM_NAME]:sub": [
                        "owner:[YOUR_TEAM_NAME]:project:[YOUR_PROJECT_NAME]:environment:preview",
                        "owner:[YOUR_TEAM_NAME]:project:[YOUR_PROJECT_NAME]:environment:production"
                    ]
                }
            }
        }
    ]
}

πŸ—’οΈ Replace YOUR_TEAM_NAME, YOUR_AWS_ACCOUNT_ID, and YOUR_PROJECT_NAME with your actual team name, AWS account ID, and project name.

Click "Next" and name the role "WordSearch_PROD_IAM_ROLE" and click "Create role". Note that we did not add any policies to this role. Let's do that next.

aws iam role

Next, we'll create a policy to attach to our role. Click on "Policies" in the sidebar and then click the orange "Create policy" button. As before, use the JSON editor.

Paste in the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "BedrockModelAccess",
            "Effect": "Allow",
            "Action": "bedrock:InvokeModel",
            "Resource": "*"
        },
        {
            "Sid": "BedrockModelAccessWithResponseStream",
            "Effect": "Allow",
            "Action": "bedrock:InvokeModelWithResponseStream",
            "Resource": "*"
        },
        {
            "Sid": "DynamoDBAccess",
            "Effect": "Allow",
            "Action": [
                "dynamodb:PutItem",
                "dynamodb:DeleteItem",
                "dynamodb:GetItem",
                "dynamodb:Query",
                "dynamodb:UpdateItem"
            ],
            "Resource": [
                "arn:aws:dynamodb:YOUR_AWS_REGION:YOUR_AWS_ACCOUNT_ID:table/WordSearch_PROD"
            ]
        }
    ]
}

πŸ—’οΈ This is the same policy as the one we created for our development role except we're using the production table.

Now in Vercel, we'll need to add the following environment variables for our production and preview environments:

AWS_ROLE_ARN=
WORDSEARCH_TABLE=WordSearch_PROD
AWS_REGION=
CLERK_SECRET_KEY=
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=

Now redeploy your application to Vercel. You should be able to sign in and play around with the fully functional wordsearch game πŸŽ‰

πŸ—’οΈ In our code, you'll see how we handle local development and production differently. Specifically, we use the AWS_ROLE_ARN environment variable for production and the AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY environment variables for local development.

Conclusion

In this post, we covered how to integrate Vercel with AWS using the NextJS Proxy Pattern. This works great for local development and production. We also briefly touched on how easy it is to use Clerk to authenticate and protect parts of our app. In the next post, we'll start to shift away from creating resources in the AWS console and leverage infrastructure-as-code to create our AWS services.

If you enjoyed this post, I'd love to hear from you. Please drop a comment below or reach out on X or LinkedIn.

Until next time, Happy Coding 🦦