import-io addresses the elephant in the technology industry's room: everyone scrapes online data. Using its powerful point-and-click data extraction tooling, a task that used to take days of coding brittle scrapers is now reduced to a few minutes.

Automatic normalisation allows users to mix hundreds of data sources, making data instantly comparable, whether accessed through a single API call or pulled into a spreadsheet. Businesses are using import-io to extract online data for everything from data journalism and retail trend analysis to commercial intelligence for due diligence and personalised events listing services. import-io is currently in Beta, but you can request an invite.
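To make the normalisation idea concrete, here is a minimal, purely illustrative sketch (not import-io's actual implementation, whose internals are not public): records from different sources use different field names, and mapping them onto one shared schema is what makes them directly comparable. The source names, field names, and `FIELD_MAP` structure below are all hypothetical.

```python
# Illustrative only: shows the general idea behind normalising
# heterogeneous source records onto a single shared schema.

# Hypothetical per-source mappings from source-specific field
# names to the shared schema ("name", "price").
FIELD_MAP = {
    "site_a": {"title": "name", "cost": "price"},
    "site_b": {"product_name": "name", "price_gbp": "price"},
}

def normalise(source, record):
    """Rename a record's source-specific fields to the shared schema."""
    mapping = FIELD_MAP[source]
    return {mapping.get(key, key): value for key, value in record.items()}

# Once normalised, records from different sources are directly comparable.
rows = [
    normalise("site_a", {"title": "Kettle", "cost": 24.99}),
    normalise("site_b", {"product_name": "Kettle", "price_gbp": 19.99}),
]
cheapest = min(rows, key=lambda row: row["price"])
```

After normalisation, every record exposes the same `name` and `price` keys, so a single query (here, `min` over `price`) can span all sources at once.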

Learn more on the import-io website.

Here is how one of import-io's users benefits:

Planvine could be described as a personalised Time Out: it needs to suck in data from hundreds of online sources. To do that, the team had looked at coding scrapers (the only other way of doing what we do), each of which would have taken developers up to a week to create and required significant ongoing maintenance, as scrapers frequently break. Fortuitously they came across import-io, and are now able to extract data from a new source in four minutes, without developer effort. The data is automatically normalised by import-io, allowing them to request nearly 500 data sources in a single API call. They have removed the requirement to hire expensive developers for this task, and were able to launch their beta in 2.5 months.