Rate limiting is a critical part of building secure, scalable, and reliable APIs. Without it, your API is vulnerable to abuse such as brute-force attacks, denial-of-service (DoS), and excessive resource consumption.
This blog explains what rate limiting is, why it matters, and how to implement it in ASP.NET, covering both modern ASP.NET Core and classic ASP.NET Web API 2 (the MVC 5 era).
What Is Rate Limiting?
Rate limiting restricts how many requests a client can make to an API within a specific time window.
Examples:
- 100 requests per minute per IP
- 10 login attempts per minute per user
- 2,000 emails per day per account
When the limit is exceeded, the server returns HTTP 429 – Too Many Requests.
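On the wire, a rejected request typically looks like this (the Retry-After header is optional but recommended):

HTTP/1.1 429 Too Many Requests
Retry-After: 60
Content-Type: text/plain

Too many requests. Please retry later.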
Why Rate Limiting Is Important
Rate limiting helps you:
- Prevent brute-force and credential-stuffing attacks
- Protect server resources and database connections
- Ensure fair usage across users
- Improve system stability under heavy load
In production systems, rate limiting is considered a baseline security feature, not an optional enhancement.
Rate Limiting in ASP.NET Core (.NET 7+)
ASP.NET Core provides built-in rate limiting middleware, making implementation clean and efficient.
Step 1: Configure Rate Limiting
Add the following configuration in Program.cs:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    options.AddFixedWindowLimiter("fixed", opt =>
    {
        opt.Window = TimeSpan.FromMinutes(1);
        opt.PermitLimit = 100;
        opt.QueueLimit = 0;
    });

    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;
});
Step 2: Enable Middleware
app.UseRateLimiter();
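Because the sample below attaches a named policy to an endpoint, the middleware must run after routing. A minimal sketch of the ordering in Program.cs, assuming the standard minimal hosting model:

var app = builder.Build();

app.UseRouting();
app.UseRateLimiter();   // must follow UseRouting so endpoint-specific policies are resolved
app.MapControllers();

app.Run();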
Step 3: Apply Rate Limit to an API
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

[EnableRateLimiting("fixed")]
[ApiController]
[Route("api/sample")]
public class SampleController : ControllerBase
{
    [HttpGet]
    public IActionResult Get() => Ok("Request successful");
}
This registers a single fixed window shared by all clients: 100 requests per minute in total for every endpoint tagged with the "fixed" policy, not 100 per caller. To throttle each client separately, partition the limiter by a per-client key such as the IP address, as shown next.
Per-IP Rate Limiting
To limit requests based on client IP:
// Registered inside builder.Services.AddRateLimiter(options => { ... })
options.AddPolicy("ip-policy", context =>
{
    var ip = context.Connection.RemoteIpAddress?.ToString() ?? "unknown";

    return RateLimitPartition.GetFixedWindowLimiter(
        ip,
        _ => new FixedWindowRateLimiterOptions
        {
            PermitLimit = 50,
            Window = TimeSpan.FromMinutes(1)
        });
});
Apply it using:
[EnableRateLimiting("ip-policy")]
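The same policy can also be attached to minimal API endpoints with RequireRateLimiting; a short sketch (the /api/data route is illustrative):

app.MapGet("/api/data", () => "ok")
   .RequireRateLimiting("ip-policy");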
Rate Limiting in ASP.NET MVC 5 / Web API 2
Classic ASP.NET does not include built-in rate limiting. A common approach is a custom action filter that counts requests per client.
Custom Rate Limit Attribute
using System;
using System.Net;
using System.Net.Http;
using System.Runtime.Caching;
using System.Threading;
using System.Web.Http.Controllers;
using System.Web.Http.Filters;

public class RateLimitAttribute : ActionFilterAttribute
{
    private static readonly MemoryCache Cache = MemoryCache.Default;
    private readonly int _limit;
    private readonly int _seconds;

    // The counter is a reference type so it can be incremented in place
    // without re-inserting the cache entry (which would reset the window).
    private sealed class Counter { public int Value; }

    public RateLimitAttribute(int limit, int seconds)
    {
        _limit = limit;
        _seconds = seconds;
    }

    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        var ip = actionContext.Request.GetOwinContext()?.Request?.RemoteIpAddress ?? "unknown";

        // AddOrGetExisting returns the counter already stored for this IP, or null when a
        // new one was just inserted, so the window's expiry is set exactly once.
        var fresh = new Counter();
        var counter = (Counter)Cache.AddOrGetExisting(ip, fresh,
            DateTimeOffset.Now.AddSeconds(_seconds)) ?? fresh;

        if (Interlocked.Increment(ref counter.Value) > _limit)
        {
            actionContext.Response = actionContext.Request
                .CreateResponse((HttpStatusCode)429, "Too many requests");
        }
    }
}
Apply Rate Limit
[RateLimit(100, 60)]
public IHttpActionResult Get()
{
    return Ok("Success");
}
This allows 100 requests per minute per IP.
Distributed Rate Limiting (Production Scenario)
In load-balanced environments, an in-memory cache is not enough: each application instance keeps its own counters, so a client whose requests are spread across N instances can make roughly N times the intended number of requests.
Recommended Solution: Redis
Example Redis key:
ratelimit:192.168.1.10:/api/login
TTL: 60 seconds
This approach ensures consistency across multiple application instances.
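A minimal sketch of that counter logic using StackExchange.Redis, assuming a shared Redis instance is already available; the class, method, and parameter names are illustrative:

using System;
using StackExchange.Redis;

public class RedisRateLimiter
{
    private readonly IDatabase _db;

    public RedisRateLimiter(IConnectionMultiplexer redis) => _db = redis.GetDatabase();

    // Returns true while the caller is within the current window.
    public bool IsAllowed(string ip, string route, int limit = 100, int windowSeconds = 60)
    {
        var key = $"ratelimit:{ip}:{route}";

        // INCR creates the key with value 1 if it does not exist; the TTL is set only
        // on that first request, so the window does not slide.
        var count = _db.StringIncrement(key);
        if (count == 1)
        {
            _db.KeyExpire(key, TimeSpan.FromSeconds(windowSeconds));
        }

        return count <= limit;
    }
}

The INCR-then-EXPIRE pair is not atomic; where strict guarantees matter, the two steps are usually combined in a small Lua script.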
HTTP Headers for Rate Limit Transparency
Best practice is to return rate limit details in headers:
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 20
Retry-After: 60
These headers help API consumers handle throttling gracefully.
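In ASP.NET Core these can be emitted from the rate limiter's OnRejected callback. A sketch, assuming the 100-request policy above (the built-in limiter exposes the retry delay as lease metadata, but not a remaining count, so X-RateLimit-Remaining has to be tracked separately):

// Inside builder.Services.AddRateLimiter(options => { ... })
options.OnRejected = async (context, cancellationToken) =>
{
    context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
    context.HttpContext.Response.Headers["X-RateLimit-Limit"] = "100";

    // The fixed window limiter reports how long until the window resets.
    if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
    {
        context.HttpContext.Response.Headers["Retry-After"] =
            ((int)retryAfter.TotalSeconds).ToString();
    }

    await context.HttpContext.Response.WriteAsync(
        "Too many requests. Please retry later.", cancellationToken);
};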
Common Rate Limiting Use Cases
- Login & OTP verification (see the policy sketch after this list)
- Password reset APIs
- Bulk email or SMS sending
- Public APIs and partner integrations
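The login and OTP scenarios above usually warrant a much tighter policy than general traffic. A sketch using the built-in sliding window limiter in ASP.NET Core, registered alongside the earlier policies (the policy name and numbers are illustrative):

options.AddSlidingWindowLimiter("login", opt =>
{
    opt.PermitLimit = 10;
    opt.Window = TimeSpan.FromMinutes(1);
    opt.SegmentsPerWindow = 6;   // smooths out bursts at window boundaries
});

Apply it to the login action with [EnableRateLimiting("login")].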
Conclusion
Rate limiting is essential for building secure and scalable ASP.NET APIs. Modern ASP.NET Core applications should leverage the built-in rate limiter, while legacy MVC 5 applications can rely on custom filters or Redis-backed solutions.
A well-designed rate limiting strategy protects your infrastructure, improves reliability, and ensures fair API usage for everyone.