Streaming File Data

Hemanta Sundaray

For large files, loading the entire content into memory isn’t practical. HttpClientRequest.bodyStream() takes a Stream<Uint8Array> and sends it as the request body, reading and transmitting chunks incrementally.

http.ts
import {
  FetchHttpClient,
  HttpClient,
  HttpClientRequest,
} from "effect/unstable/http";
import { Effect, Stream } from "effect";

function uploadLargeData() {
  return Effect.gen(function* () {
    const client = yield* HttpClient.HttpClient;

    // Create a stream that produces chunks of data.
    // In practice, this might come from reading a large file.
    const encoder = new TextEncoder();
    const dataStream = Stream.fromIterable([
      encoder.encode("chunk 1: Hello "),
      encoder.encode("chunk 2: from "),
      encoder.encode("chunk 3: a stream!"),
    ]);

    const request = HttpClientRequest.post("https://httpbin.org/post").pipe(
      HttpClientRequest.bodyStream(dataStream, {
        contentType: "text/plain",
        contentLength: 47,
      }),
    );

    const response = yield* client.execute(request);
    const data = yield* response.json;
    return data;
  }).pipe(Effect.provide(FetchHttpClient.layer));
}

// Test it
Effect.runPromise(uploadLargeData()).then((data: any) => {
  console.log("Body received:", data.data);
});

Output:

Terminal
Body received: chunk 1: Hello chunk 2: from chunk 3: a stream!

bodyStream takes a Stream<Uint8Array> and an optional options object with contentType and contentLength. The content length is optional but recommended. Some servers reject requests without a Content-Length header, and others use it to show upload progress.
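When every chunk can be enumerated up front, one way to get the length exactly right is to encode the chunks first and sum their byte sizes. A minimal sketch, where the totalLength helper is hypothetical (not part of the Effect API):

```typescript
// Hypothetical helper (not part of the Effect API): sum the byte sizes
// of pre-encoded chunks to get an exact Content-Length.
const totalLength = (chunks: ReadonlyArray<Uint8Array>): number =>
  chunks.reduce((sum, chunk) => sum + chunk.byteLength, 0);

const encoder = new TextEncoder();
const chunks = [
  encoder.encode("chunk 1: Hello "),
  encoder.encode("chunk 2: from "),
  encoder.encode("chunk 3: a stream!"),
];

// Matches the contentLength: 47 used in the example above.
console.log(totalLength(chunks));
```

This only works when the chunks are known in advance; for an upload sourced from a file, the file's size on disk serves the same purpose.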

The stream is consumed lazily. Each chunk is sent over the network as it’s produced, so only one chunk needs to be in memory at a time. This makes it suitable for uploading files of any size.
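To make the chunk-at-a-time behavior concrete outside of Effect, here is a plain Node sketch that reads a file through a fixed 4-byte buffer, so the full file contents are never held in memory at once. The file path and chunk size are illustrative:

```typescript
import { closeSync, openSync, readSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Illustrative file: 18 bytes of ASCII text in the temp directory.
const path = join(tmpdir(), "upload-example.txt");
writeFileSync(path, "Hello from a file!");

// Read the file 4 bytes at a time: only one small buffer is ever live.
const fd = openSync(path, "r");
const buffer = new Uint8Array(4);
let chunkCount = 0;
let totalBytes = 0;
let bytesRead = 0;
while ((bytesRead = readSync(fd, buffer, 0, buffer.length, null)) > 0) {
  chunkCount += 1;
  totalBytes += bytesRead;
}
closeSync(fd);

console.log(`${chunkCount} chunks, ${totalBytes} bytes`);
```

Chunks produced this way could feed bodyStream through a Stream constructor such as Stream.fromIterable, or Stream.fromAsyncIterable for an asynchronous source.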
