"If a worker wants to do his job well, he must first sharpen his tools." - Confucius, "The Analects of Confucius. Lu Linggong"

Adding API Rate Limiting to Your Go API

Published on 2024-11-08

Alright, folks, we’ve covered a lot so far: JWT authentication, database connections, logging, and error handling. But what happens when your API starts getting slammed with requests? Without control, high traffic can lead to slow response times or even downtime.

This week, we’re going to solve that by implementing rate limiting to control the flow of traffic. We’ll be using the simple and effective golang.org/x/time/rate package. Later, when my own ThrottleX solution is ready, I’ll show you how to integrate that as a more scalable option. (Psst, check out my GitHub at github.com/neelp03/throttlex for updates! Feel free to comment any issues you see in there o7)

Why Rate Limiting?

Rate limiting is like a bouncer for your API—it controls the number of requests users can make within a given timeframe. This prevents your API from getting overwhelmed, ensuring smooth and fair access for all users. Rate limiting is essential for:

  • Preventing Abuse: Stops bad actors or overly enthusiastic users from overwhelming your API.
  • Stability: Keeps your API responsive and reliable, even during traffic spikes.
  • Fairness: Allows resources to be shared equally among users.

Step 1: Installing the time/rate Package

The golang.org/x/time/rate package is part of the extended Go libraries and provides a straightforward token-based rate limiter. To get started, you’ll need to install it:


go get golang.org/x/time/rate
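
To get a feel for how the limiter behaves before we wire it into middleware, here’s a quick standalone sketch (the numbers are arbitrary, just for illustration). NewLimiter takes a refill rate and a burst size, and Allow() reports whether a token is available right now:


package main

import (
    "fmt"
    "time"

    "golang.org/x/time/rate"
)

func main() {
    // Refill one token every 200ms, allow a burst of up to 3.
    limiter := rate.NewLimiter(rate.Every(200*time.Millisecond), 3)

    // The first 3 calls succeed immediately (the burst); the rest are denied
    // until tokens have had time to refill.
    for i := 1; i <= 5; i++ {
        fmt.Printf("request %d allowed: %v\n", i, limiter.Allow())
    }
}
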


Step 2: Setting Up the Rate Limiter

Let’s create a rate-limiting middleware that controls the number of requests a client can make. In this example, we’ll limit clients to 5 requests per minute.


package main

import (
    "net/http"
    "sync"
    "time"

    "golang.org/x/time/rate"
)

// Create a struct to hold each client's rate limiter
type Client struct {
    limiter *rate.Limiter
}

// In-memory storage for clients
var clients = make(map[string]*Client)
var mu sync.Mutex

// Get a client's rate limiter or create one if it doesn't exist
func getClientLimiter(ip string) *rate.Limiter {
    mu.Lock()
    defer mu.Unlock()

    // If the client already exists, return the existing limiter
    if client, exists := clients[ip]; exists {
        return client.limiter
    }

    // Create a new limiter: 5 requests per minute on average, with a burst of up to 5
    limiter := rate.NewLimiter(rate.Every(time.Minute/5), 5)
    clients[ip] = &Client{limiter: limiter}
    return limiter
}
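
One caveat with this in-memory approach: the clients map only ever grows, because we never remove entries for IPs that stop sending traffic. For a long-running server you’d probably want to evict stale clients. Here’s a rough sketch of that idea; it assumes the Client struct gains a lastSeen field (not part of the code above) that getClientLimiter updates on every call:


// Sketch only: assumes Client is extended with a lastSeen timestamp, e.g.
//
//   type Client struct {
//       limiter  *rate.Limiter
//       lastSeen time.Time
//   }
//
// and that getClientLimiter sets clients[ip].lastSeen = time.Now() on each request.

// cleanupClients periodically removes clients that haven't been seen recently.
// Start it once from main() with: go cleanupClients()
func cleanupClients() {
    for {
        time.Sleep(time.Minute)

        mu.Lock()
        for ip, client := range clients {
            if time.Since(client.lastSeen) > 3*time.Minute {
                delete(clients, ip)
            }
        }
        mu.Unlock()
    }
}
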


Step 3: Creating the Rate Limiting Middleware

Now, let’s use the getClientLimiter function in a middleware that will restrict access based on the rate limit.


func rateLimitingMiddleware(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        ip := r.RemoteAddr
        limiter := getClientLimiter(ip)

        // Check if the request is allowed
        if !limiter.Allow() {
            http.Error(w, "Too Many Requests", http.StatusTooManyRequests)
            return
        }

        next.ServeHTTP(w, r)
    })
}


How It Works:

  1. IP-Based Limiting: Each client is identified by their IP address. We check the client’s IP and assign a rate limiter to it (strictly speaking, r.RemoteAddr is an “ip:port” pair; see the sketch just after this list).
  2. Request Check: The limiter.Allow() method checks if the client is within the rate limit. If they are, the request proceeds to the next handler; if not, we respond with 429 Too Many Requests.
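
A quick aside on point 1: because r.RemoteAddr includes the port, and because behind a proxy or load balancer it will be the proxy’s address rather than the real client’s, you may want a slightly more careful lookup. Here’s a small sketch (the getClientIP helper is an illustration, not something from the code above; it needs "net" and "strings" added to the import block):


// getClientIP returns a best-effort client IP: it prefers the first entry in
// X-Forwarded-For (only trust this header if you run behind a proxy you control)
// and otherwise strips the port from RemoteAddr.
func getClientIP(r *http.Request) string {
    if fwd := r.Header.Get("X-Forwarded-For"); fwd != "" {
        // The header may hold a comma-separated chain; the first entry is the client.
        return strings.TrimSpace(strings.Split(fwd, ",")[0])
    }
    host, _, err := net.SplitHostPort(r.RemoteAddr)
    if err != nil {
        return r.RemoteAddr
    }
    return host
}


In the middleware you would then call ip := getClientIP(r) instead of using r.RemoteAddr directly.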

Step 4: Applying the Middleware Globally

Now let’s hook up our rate limiter to the API so every request has to pass through it:


func main() {
    db = connectDB()
    defer db.Close()

    r := mux.NewRouter()

    // Apply rate-limiting middleware globally
    r.Use(rateLimitingMiddleware)

    // Other middlewares
    r.Use(loggingMiddleware)
    r.Use(errorHandlingMiddleware)

    r.HandleFunc("/login", login).Methods("POST")
    r.Handle("/books", authenticate(http.HandlerFunc(getBooks))).Methods("GET")
    r.Handle("/books", authenticate(http.HandlerFunc(createBook))).Methods("POST")

    fmt.Println("Server started on port :8000")
    log.Fatal(http.ListenAndServe(":8000", r))
}


By applying r.Use(rateLimitingMiddleware), we ensure that every incoming request is checked by the rate limiter before it reaches any endpoint.
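
Because rateLimitingMiddleware is just a func(http.Handler) http.Handler, you could also apply it to individual routes instead of globally if you only want certain endpoints throttled. For example, a sketch of limiting only the book-creation route (this replaces the corresponding line inside main() above rather than standing on its own):


// Rate-limit only POST /books; other routes are left untouched.
r.Handle("/books",
    rateLimitingMiddleware(authenticate(http.HandlerFunc(createBook))),
).Methods("POST")
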


Step 5: Testing the Rate Limiting

Start your server:


go run main.go


Now, let’s hit the API with some requests. You can use a loop with curl to simulate multiple requests in a row:


for i in {1..10}; do curl http://localhost:8000/books; done


Since the limiter allows a burst of 5 requests and then refills at a rate of 5 per minute, the first few requests in the loop go through and the rest come back as 429 Too Many Requests once you exceed the allowed rate.
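
If you’d rather watch the status codes than the response bodies, a small variation on that loop (same endpoint, just different curl flags) makes the cutoff easy to see. Depending on whether you send a valid JWT, the first few responses will be 200 or 401; once the burst is used up they switch to 429:


for i in {1..10}; do curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/books; done
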


What’s Next?

And there you have it—rate limiting with golang.org/x/time/rate to keep your API stable and responsive under pressure. Rate limiting is a crucial tool for any scalable API, and we’re just scratching the surface here.

Once ThrottleX is production-ready, I’ll be posting a follow-up tutorial to show you how to integrate it into your Go API for even more flexibility and distributed rate limiting. Keep an eye on my ThrottleX GitHub repo for updates!

Next week, we’re going to containerize our API with Docker, so it’s ready to run anywhere. Stay tuned, and happy coding!
