The API is plain HTTP, so Node's built-in fetch (Node 18 and later) is enough; there is no SDK to install. Store your API key in an environment variable:
export SCRAPEUNBLOCKER_KEY="su_live_..."
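Every request in this guide sends the same header, so a thin wrapper can cut the repetition. This is just a convenience sketch; the callApi name and its shape are ours, not part of the API:

async function callApi(endpoint, query) {
  const params = new URLSearchParams(query);
  return fetch(`https://api.scrapeunblocker.com/${endpoint}?${params}`, {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  });
}

The examples below spell each call out in full so they stand alone.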
Fetch page HTML
const params = new URLSearchParams({ url: "https://example.com" });
const response = await fetch(
  `https://api.scrapeunblocker.com/getPageSource?${params}`,
  {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  }
);
const html = await response.text();
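Note that fetch resolves even on HTTP error statuses; only network failures reject. If you want a hard failure on a bad status, check response.ok yourself:

if (!response.ok) {
  throw new Error(`HTTP ${response.status}: ${await response.text()}`);
}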
Get parsed JSON instead of HTML
Pass parsed_data=true and the API returns structured data extracted via Schema.org, __NEXT_DATA__, or AI-generated rules.
const params = new URLSearchParams({
  url: "https://www.amazon.com/dp/B08N5WRWNW",
  parsed_data: "true",
});
const response = await fetch(
  `https://api.scrapeunblocker.com/getPageSource?${params}`,
  {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  }
);
const payload = await response.json();
console.log(payload.data.page_type); // "product"
console.log(payload.data.data.title);
console.log(payload.data.data.price);
URLSearchParams coerces every value to a string, so be explicit: pass "true" and "2", not the bare true and 2.
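A quick illustration (plain URLSearchParams behavior, nothing API-specific):

const qs = new URLSearchParams({ parsed_data: "true", pages_to_check: "2" });
console.log(qs.toString()); // "parsed_data=true&pages_to_check=2"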
Scrape a Google SERP
const params = new URLSearchParams({
  keyword: "best running shoes",
  pages_to_check: "2",
});
const response = await fetch(
  `https://api.scrapeunblocker.com/serpApi?${params}`,
  {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  }
);
const serp = await response.json();
for (const r of serp.organic) {
  console.log(r.position, r.title, r.url);
}
Force a country
const params = new URLSearchParams({
  url: "https://www.amazon.de/dp/B08N5WRWNW",
  parsed_data: "true",
  proxy_country: "de",
});
const response = await fetch(
  `https://api.scrapeunblocker.com/getPageSource?${params}`,
  {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  }
);
Capture cookies and the proxy used
const params = new URLSearchParams({
  url: "https://example.com",
  get_cookies: "true",
});
const response = await fetch(
  `https://api.scrapeunblocker.com/getPageSource?${params}`,
  {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  }
);
const { html, cookies, proxy } = await response.json();
const cookieJar = Object.fromEntries(cookies.map(c => [c.name, c.value]));
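If you want to replay those cookies in a later request, serializing the jar into a Cookie header is plain HTTP rather than an API feature; a minimal sketch:

const cookieHeader = cookies.map(c => `${c.name}=${c.value}`).join("; ");
// e.g. fetch(url, { headers: { cookie: cookieHeader } })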
Fetch an image as PNG bytes
import { writeFile } from "node:fs/promises";

const params = new URLSearchParams({ url: "https://example.com/photo.jpg" });
const response = await fetch(
  `https://api.scrapeunblocker.com/getImage?${params}`,
  {
    method: "POST",
    headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
  }
);
await writeFile("photo.png", Buffer.from(await response.arrayBuffer()));
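If you want to confirm the bytes really are a PNG before writing, checking the standard 8-byte PNG signature is a file-format check, not an API feature; a variant of the last line:

const bytes = Buffer.from(await response.arrayBuffer());
const PNG_MAGIC = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]);
if (!bytes.subarray(0, 8).equals(PNG_MAGIC)) {
  throw new Error("response is not a PNG");
}
await writeFile("photo.png", bytes);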
Concurrency control with p-limit
For high-throughput crawls, cap the number of in-flight requests at your plan's concurrency limit.
import pLimit from "p-limit";

const limit = pLimit(10);

async function scrape(url) {
  const params = new URLSearchParams({ url, parsed_data: "true" });
  const r = await fetch(
    `https://api.scrapeunblocker.com/getPageSource?${params}`,
    {
      method: "POST",
      headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
    }
  );
  return r.json();
}

const urls = [
  "https://www.amazon.com/dp/B08N5WRWNW",
  "https://www.amazon.com/dp/B07FZ8S74R",
];
const results = await Promise.all(urls.map(u => limit(() => scrape(u))));
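Promise.all rejects as soon as any single scrape throws, discarding the rest of the batch. For a long crawl you may prefer Promise.allSettled, which is standard JavaScript and keeps the successes:

const settled = await Promise.allSettled(urls.map(u => limit(() => scrape(u))));
const ok = settled.filter(s => s.status === "fulfilled").map(s => s.value);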
Retries that handle every failure mode
A 403 rotates the proxy country once; 408, 503, and 504 are retried with exponential backoff; 401 and 422 fail fast.
const RETRYABLE = new Set([408, 503, 504]);

async function fetchWithRetry(url, params = {}) {
  let rotated = false;
  let last;
  for (let attempt = 0; attempt < 4; attempt++) {
    const qs = new URLSearchParams({ url, ...params });
    const r = await fetch(
      `https://api.scrapeunblocker.com/getPageSource?${qs}`,
      {
        method: "POST",
        headers: { "x-scrapeunblocker-key": process.env.SCRAPEUNBLOCKER_KEY },
      }
    );
    if (r.ok) return r;
    last = r;
    // 403: rotate the proxy country once, then retry immediately.
    if (r.status === 403 && !rotated) {
      params = { ...params, proxy_country: "us" };
      rotated = true;
      continue;
    }
    // 408/503/504: exponential backoff (1s, 2s, 4s, ...) before the next attempt.
    if (RETRYABLE.has(r.status)) {
      await new Promise(res => setTimeout(res, 2 ** attempt * 1000));
      continue;
    }
    // Anything else (401, 422, ...) is not retryable: fail fast.
    break;
  }
  throw new Error(`HTTP ${last.status}: ${await last.text()}`);
}
const response = await fetchWithRetry("https://example.com", { parsed_data: "true" });
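fetchWithRetry throws once its attempts are exhausted or a non-retryable status lands, so wrap calls in try/catch where one lost page must not kill the process:

try {
  const response = await fetchWithRetry("https://example.com", { parsed_data: "true" });
  const payload = await response.json();
} catch (err) {
  console.error("giving up:", err.message);
}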
See handling failures for what each status code means.