Configuring a response
There are several overloads to configure a response. Here are just a few:
mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Body("text data")
        .ContentType("text/plain", Encoding.UTF8)
    );
mockHttp
    .When(...)
    .Respond((context, cancellationToken) => new HttpResponseMessage(HttpStatusCode.BadRequest));
mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .JsonBody(new Person { FullName = "John Doe" }) // Requires 'skwas.MockHttp.Json' package
    );
To throw an exception in response to a request:
mockHttp
    .When(...)
    .Throws<InvalidOperationException>();
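The configured exception then surfaces on the client call. A minimal sketch of asserting this, assuming mockHttp is a MockHttpHandler wrapped by an HttpClient, xUnit for the assertion, and a hypothetical URI:

var client = new HttpClient(mockHttp);

// The InvalidOperationException configured above is thrown when the request is handled.
await Assert.ThrowsAsync<InvalidOperationException>(
    () => client.GetAsync("http://localhost/api/data"));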
Latency is defined as the time it takes for a request to arrive at the server. We can simulate this by introducing an artificial delay using the Latency() extension. When the request is sent and handled by the mock handler, it will wait using a Task.Delay() for the specified amount of time before returning the configured response.
using static MockHttp.NetworkLatency;

mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Latency(ThreeG)
    );

// Or some of the other overloads, like:
mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Latency(Around(TimeSpan.FromMilliseconds(100)))
    );
For convenience, the NetworkLatency helper type includes several functions that add some variance to the latency (TwoG, ThreeG, FourG and FiveG), but we can of course also specify the delay with TimeSpans using Around() and Between().
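For example, a minimal sketch using Between(), which presumably takes a lower and upper bound as TimeSpans so the mock picks a latency within that range:

using static MockHttp.NetworkLatency;

mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Latency(Between(TimeSpan.FromMilliseconds(50), TimeSpan.FromMilliseconds(150)))
    );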
As mentioned above, the Latency() extension only simulates the time it takes for a request to arrive at the server (MockHttp in this case). The response data returned by MockHttp is then returned nearly instantaneously, since it runs in-proc with the test fixture and is quite likely served from memory or a fast disk (depending on your MockHttp response setup). If we want to simulate more realistic network transfer rates, we can use the RateLimitedStream stream wrapper, which limits the rate at which the client can read bytes from the response stream to a specified bit rate.
We can use the stream wrapper directly to wrap another stream:
using Stream stream = ...; // A big stream

mock.When(...)
    .Respond(with => with
        .Body(() => new RateLimitedStream(stream, 512_000)) // Rate limited to 512 kbps
    );
Or use the helper extension which works with any type of content returned:
byte[] data = ...;

mock.When(...)
    .Respond(with => with
        .Body(data)
        .TransferRate(512_000) // Rate limited to 512 kbps
    );
Tip: and of course, you can combine latency and transfer rate!
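For example, a minimal sketch combining both on a single response (the exact ordering of the builder calls here is an assumption; only extensions shown earlier on this page are used):

using static MockHttp.NetworkLatency;

byte[] data = ...;

mockHttp
    .When(...)
    .Respond(with => with
        .StatusCode(200)
        .Body(data)
        .Latency(FourG)         // Simulated time before the response starts arriving
        .TransferRate(512_000)  // Then stream the body at ~512 kbps
    );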
For more complex response configurations and/or reusability, implement IResponseStrategy.
public class MyResponseStrategy : IResponseStrategy
{
    public Task<HttpResponseMessage> ProduceResponseAsync(MockHttpRequestContext requestContext, CancellationToken cancellationToken)
    {
        // Custom response logic, for example returning a fixed response:
        return Task.FromResult(new HttpResponseMessage(HttpStatusCode.OK));
    }
}
mockHttp
    .When(...)
    .RespondUsing(new MyResponseStrategy());
An added benefit is that it helps keep the unit tests themselves clean.
Multiple responses can be configured to form a sequence. This is useful when a request is expected to happen multiple times, but with different responses.
Example use cases:
- testing resilience by tripping circuit breaker/retry logic and only succeeding after the nth request
- scrolling/paginating APIs that return a subset of a larger list of data
The Respond, RespondUsing and Throws response configuration extensions can all be chained to form a sequence.
mockHttp
    .When(...)
    .Respond(with => with.StatusCode(HttpStatusCode.BadGateway))
    .Respond(with => with.ClientTimeout(TimeSpan.FromMilliseconds(500))) // TaskCanceledException after 500 milliseconds
    .Respond(with => with.StatusCode(HttpStatusCode.OK))
    .Throws<HttpRequestException>()
    .Respond(with => with.StatusCode(HttpStatusCode.OK))
    .RespondUsing(new MyResponseStrategy())
    .Respond(with => with.StatusCode(HttpStatusCode.Accepted));
The last configured response will be repeated if more requests are executed. As per the above example, the 7th request and any subsequent request would all receive status code Accepted.
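To illustrate, a rough sketch of how the above sequence plays out when the same request is sent repeatedly (the HttpClient wiring and URI are assumptions, not part of the library's API):

var client = new HttpClient(mockHttp);

for (int i = 1; i <= 8; i++)
{
    try
    {
        HttpResponseMessage response = await client.GetAsync("http://localhost/api/data");
        Console.WriteLine($"Request {i}: {(int)response.StatusCode}");
    }
    catch (Exception ex)
    {
        // Requests 2 (client timeout) and 4 (Throws) surface as exceptions on the client.
        Console.WriteLine($"Request {i}: {ex.GetType().Name}");
    }
}
// Requests 7 and 8 both return 202 Accepted: the last configured response repeats.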