Category: data-quality

Spidermon: Scrapinghub’s Secret Sauce To Our Data Quality & Reliability Guarantee

If you know anything about Scrapinghub, you know that we are obsessed with data quality and data reliability.

Besides building some of the most powerful web scraping tools in the world, we also specialise in helping companies extract the data they need for their mission-critical business requirements. Most notably, companies that:

  • Rely on web data to make critical business decisions, or;
  • ...

Meet Spidermon: Scrapinghub’s Battle Tested Spider Monitoring Library [Now Open Sourced]

Your spider is developed and you're getting your structured data daily, so the job is done, right?

Absolutely not! Website changes (sometimes very subtle ones), anti-bot countermeasures and temporary problems often reduce the quality and reliability of your data.
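Spidermon, now open sourced, lets you codify these checks as monitors that run against your crawl's stats and items. Below is a minimal sketch of the kind of monitor Spidermon's Scrapy integration supports; the module path myproject.monitors and the threshold of 10 items are placeholder assumptions for illustration, not values taken from the post.

from spidermon import Monitor, MonitorSuite, monitors


@monitors.name("Item count")
class ItemCountMonitor(Monitor):

    @monitors.name("Minimum number of items extracted")
    def test_minimum_number_of_items(self):
        # Read the item count from the Scrapy crawl stats exposed to the monitor.
        item_count = getattr(self.data.stats, "item_scraped_count", 0)
        minimum_threshold = 10  # placeholder threshold for illustration
        self.assertTrue(
            item_count >= minimum_threshold,
            msg="Extracted fewer than {} items".format(minimum_threshold),
        )


class SpiderCloseMonitorSuite(MonitorSuite):
    # Monitors in this suite run when the spider closes.
    monitors = [ItemCountMonitor]

The suite is then wired into the Scrapy project's settings.py roughly like this (again, myproject.monitors is assumed to be wherever the suite lives):

SPIDERMON_ENABLED = True
EXTENSIONS = {
    "spidermon.contrib.scrapy.extensions.Spidermon": 500,
}
SPIDERMON_SPIDER_CLOSE_MONITORS = (
    "myproject.monitors.SpiderCloseMonitorSuite",
)

With a suite like this in place, a crawl that silently returns too few items fails its checks at spider close instead of slipping bad data into your pipeline.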

The Rise of Web Data in Hedge Fund Decision Making & The Importance of Data Quality

Over the past few years, there has been an explosion in the use of alternative data sources in investment decision-making at hedge funds, investment banks and private equity firms.

These new data sources, collectively known as “alternative data”, have the potential to give firms a crucial informational edge in the market, enabling them to generate alpha.


