

A HAR (HTTP Archive) file is a JSON-formatted log of a browser session’s network traffic. Every modern browser’s DevTools, plus proxy tools like Charles, Fiddler, and mitmproxy, can export a HAR. Requestly imports a HAR as a fully editable collection of requests - useful for replaying a captured flow, building an API collection from a real session, or sharing a reproducible bug report.
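A stripped-down example shows the shape of the format. The field values below are invented, but the `log` / `entries` structure, with one request paired with one captured response per entry, follows the HAR 1.2 format:

```python
import json

# A minimal HAR document: a "log" object with an "entries" array, each
# entry pairing one request with its captured response (HAR 1.2 shape).
har_text = """
{
  "log": {
    "version": "1.2",
    "creator": {"name": "DevTools", "version": "120"},
    "entries": [
      {
        "startedDateTime": "2024-01-01T00:00:00.000Z",
        "time": 42,
        "request": {"method": "GET", "url": "https://api.example.com/users",
                    "headers": [], "queryString": []},
        "response": {"status": 200, "statusText": "OK", "headers": [],
                     "content": {"mimeType": "application/json",
                                 "text": "[{\\"id\\": 1}]"}}
      }
    ]
  }
}
"""

har = json.loads(har_text)
for entry in har["log"]["entries"]:
    print(entry["request"]["method"], entry["request"]["url"],
          "->", entry["response"]["status"])
```

Because a HAR is plain JSON, any JSON-aware tool can inspect it before you import.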

How to Import a HAR File

Import HAR modal with file upload area.
1. Capture a HAR file

In Chrome or Edge, open DevTools → Network, reproduce the flow you want to capture, then right-click any row and choose Save all as HAR with content. Firefox and Safari offer the same export from their network panels. Charles Proxy, Fiddler, and mitmproxy all support HAR export from their session menus.
2. Open the Import dialog

In the API Client, click the Import button in the top-left corner and choose HAR from the dropdown.
3. Select your .har file

Drag the file onto the upload area or click to browse. HAR files up to 30 MB are supported. Larger files are rejected before parsing - if your capture is bigger, narrow the recording window in DevTools and re-export.
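If you script your captures, you can pre-check the size before uploading. This is a small sketch using only the documented 30 MB cap; the helper name is illustrative, not a Requestly API:

```python
import os

MAX_HAR_BYTES = 30 * 1024 * 1024  # Requestly's documented 30 MB import cap

def har_within_limit(path: str) -> bool:
    """Return True if the file is small enough for Requestly to accept."""
    return os.path.getsize(path) <= MAX_HAR_BYTES
```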
4. Choose what to import

Requestly parses the file and shows a preview with two options:
  • All requests - every captured request, including images, scripts, stylesheets, and fonts.
  • Only API calls - JSON and XML APIs, form submissions, mutations, and CORS preflight requests. Static assets (images, CSS, JS, fonts) are skipped.
The count next to each option tells you how many requests will land in the collection. Pick the mode that matches what you want to do with the import.
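Requestly's exact classification rules are internal, but the split can be approximated from each entry's response `mimeType`. The prefix list below is an assumption for illustration, not Requestly's actual logic:

```python
# Content types treated as "static assets" in this sketch; Requestly's
# real classification may differ.
STATIC_PREFIXES = ("image/", "font/", "text/css",
                   "text/javascript", "application/javascript")

def is_api_call(entry: dict) -> bool:
    """Rough heuristic: keep entries whose response isn't a static asset."""
    mime = entry.get("response", {}).get("content", {}).get("mimeType", "")
    return not mime.startswith(STATIC_PREFIXES)

def split_counts(entries: list) -> tuple:
    """Return ("All requests" count, "Only API calls" count)."""
    api = [e for e in entries if is_api_call(e)]
    return len(entries), len(api)
```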
5. Click Import

Requestly creates a collection named HAR_Import_<timestamp> containing every imported request. Each request is paired with an example holding the captured response, so you can compare the original payload against what you get when you replay.

How HAR Entries Map to a Collection

  • Root collection. Every import produces one collection at the root, named HAR_Import_YYYY-MM-DD_HH-MM-SS. The timestamp keeps repeated imports distinct in the sidebar.
  • Sub-collections from pages. If the HAR file includes a log.pages array (Chrome and Firefox both populate it on a per-tab navigation basis), each page becomes a sub-collection under the root, named after the page title. Requests are grouped under the page they belong to. Entries that don’t reference any page sit at the root level next to the sub-collections.
  • Requests and examples. Every HAR entry produces one request plus one example. The request is editable like any other Requestly request; the example holds the response that was captured (status, headers, body, timing) so you can replay against it without losing the original.
  • Request body. JSON, form-urlencoded, and multipart bodies are detected from the captured Content-Type and opened in the right editor. Other bodies open in the raw editor with the appropriate syntax (HTML, XML, JavaScript, plain text).
  • Cookies. When the HAR’s structured cookies[] array is present (Chrome and Firefox exports), it is the source of truth - the imported request gets a single Cookie header built from those entries, and any duplicate raw Cookie header is dropped. Captures from proxy tools that only carry cookies in the raw header are passed through verbatim.
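The page grouping described above can be sketched with the standard `log.pages` and `pageref` fields from the HAR format. The function name is illustrative:

```python
from collections import defaultdict

def group_entries(har: dict) -> tuple:
    """Group HAR entries by page, mirroring the mapping above: entries
    with a pageref go under that page's sub-collection, the rest at root."""
    pages = {p["id"]: p.get("title", p["id"])
             for p in har["log"].get("pages", [])}
    grouped = defaultdict(list)   # page title -> entries
    root = []                     # entries with no page reference
    for entry in har["log"]["entries"]:
        ref = entry.get("pageref")
        if ref in pages:
            grouped[pages[ref]].append(entry)
        else:
            root.append(entry)
    return dict(grouped), root
```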

What’s Skipped, and Why

Some HAR entries can’t be imported as Requestly requests. The preview surfaces a warning when this happens:
  • WebSocket connections. Entries with a ws:// or wss:// URL, or marked _resourceType: "websocket" by Chrome, are skipped - Requestly’s API Client doesn’t support WebSocket replay.
  • Binary request bodies. Some captures (analytics SDKs, gzip-compressed payloads) contain binary POST bodies with embedded null bytes. Requestly strips the null bytes so the body can be stored, and warns you that the imported request may not reproduce the original wire format byte-for-byte. The text-readable portion of the body is preserved.
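The two rules above can be sketched as simple checks on an entry. The helper names are illustrative; the `_resourceType` field is a Chrome-specific extension to the HAR format:

```python
def should_skip(entry: dict) -> bool:
    """Skip WebSocket entries: ws:// or wss:// URLs, or entries Chrome
    marks with _resourceType: "websocket"."""
    url = entry.get("request", {}).get("url", "")
    return (url.startswith(("ws://", "wss://"))
            or entry.get("_resourceType") == "websocket")

def sanitize_body(text: str) -> str:
    """Strip embedded null bytes so a binary-ish body can be stored;
    the text-readable portion is preserved."""
    return text.replace("\x00", "")
```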
If the HAR file is empty, isn’t valid JSON, or doesn’t contain a log.entries array, the import fails before the preview step with an error explaining what’s wrong.
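Those pre-preview checks amount to a short validation pass; this sketch mirrors the failure cases described, with illustrative error messages:

```python
import json

def validate_har(text: str) -> list:
    """Return the entries list, or raise ValueError explaining why the
    file can't be imported (empty, invalid JSON, or no log.entries)."""
    if not text.strip():
        raise ValueError("HAR file is empty")
    try:
        har = json.loads(text)
    except json.JSONDecodeError as exc:
        raise ValueError(f"HAR file is not valid JSON: {exc}") from exc
    entries = har.get("log", {}).get("entries")
    if not isinstance(entries, list):
        raise ValueError("HAR file has no log.entries array")
    return entries
```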
HAR files often contain sensitive request and response data - auth tokens, cookies, and personal information in response bodies. Treat a .har file like a credential dump: don't share it publicly, and scrub it before attaching it to a bug report.
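A minimal scrubber might redact well-known sensitive headers and drop response bodies before sharing. The header list and function name here are assumptions; extend them for your own environment:

```python
# Headers commonly worth redacting before sharing a capture.
SENSITIVE = {"authorization", "cookie", "set-cookie", "x-api-key"}

def scrub_har(har: dict) -> dict:
    """Redact sensitive header values in place and drop response bodies."""
    for entry in har.get("log", {}).get("entries", []):
        for section in ("request", "response"):
            for header in entry.get(section, {}).get("headers", []):
                if header.get("name", "").lower() in SENSITIVE:
                    header["value"] = "REDACTED"
        # Response bodies can carry personal data; drop them entirely.
        entry.get("response", {}).get("content", {}).pop("text", None)
    return har
```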

What’s Next?

Save to a Collection

Organize and rename the imported requests

Add Environment Variables

Replace hardcoded hosts and tokens with reusable variables

Run the Collection

Replay every captured request in sequence