LinkedIn lost in court to a data analytics company that scrapes LinkedIn’s public profile info

On September 9th, 2019, the United States Court of Appeals for the Ninth Circuit affirmed the district court’s determination that a certain data analytics company may lawfully scrape [perform automated gathering of] LinkedIn’s public profile info. A historic event has happened: a court is protecting a data extractor’s right to mass-gather openly presented business directory information. more…

Scraping with free or paid proxies – what is the difference?

Anything free always sounds appealing. And we are often ready to go the extra mile to avoid expenses if we can. But is it a good idea to choose the free option when it comes to using proxies for data scraping? Or should you stick to the paid ones for better results?

Let’s weigh all the pros and cons to see why you should consider using residential IP providers like Infatica, Luminati, NetNut, Geosurf and others.
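Whichever provider you choose, the mechanics of routing scraper traffic through a proxy are the same. Below is a minimal sketch using only the Python standard library; the proxy address and credentials are placeholders — substitute the endpoint your provider gives you.

```python
# Minimal sketch: sending scraper traffic through a proxy with urllib.
# The proxy URL below is a hypothetical placeholder, not a real endpoint.
import urllib.request

def proxy_opener(proxy_url: str) -> urllib.request.OpenerDirector:
    """Build an opener that routes both HTTP and HTTPS traffic via one proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    opener = urllib.request.build_opener(handler)
    # A browser-like User-Agent lowers the chance of a trivial bot block.
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]
    return opener

# Hypothetical residential proxy endpoint (user:pass@host:port):
opener = proxy_opener("http://user:pass@proxy.example.com:8080")
# html = opener.open("https://example.com", timeout=15).read()
```

With a paid residential pool you would typically rotate `proxy_url` between requests; free proxies rarely survive long enough to make rotation worthwhile.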


What are the best online resources to acquire data?

Recently I received this question: What are the best online resources to acquire data from?

The top sites for data scraping are data aggregators. Why are they the top targets for data extraction?
They are top because they provide the fullest, most comprehensive data sets, and the data in them is highly categorized. Therefore you do not need to crawl and fetch other resources and then combine data from multiple sources.

Those sites fall into 2 categories:

1. Goods and services aggregators, e.g. AliExpress, Amazon, Craigslist.
2. Personal and company data aggregators, e.g. LinkedIn, Xing, YellowPages. Another name for such aggregators is business directories.

The first category of sites and services is quite widespread. These sites and services promote their goods with the goal of being well known online and having as many backlinks to them as possible.

The second category, the business directories, does not tend to reveal its data to the public. These directories rather promote their brand and give scraping bots minimal opportunity for data acquisition*.

Consider the following picture, where a company data aggregator gives the user only two input fields: what and where.
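A scraper usually drives those two fields programmatically: it enumerates keyword/location pairs and builds one search URL per pair. A minimal sketch, assuming a hypothetical directory URL and parameter names:

```python
# Sketch: enumerating a directory's two-field search ("what" / "where").
# The base URL and parameter names are hypothetical, not a real service.
from urllib.parse import urlencode

BASE = "https://directory.example.com/search"

def search_urls(whats, wheres):
    """Yield one search URL per (what, where) combination."""
    for what in whats:
        for where in wheres:
            yield f"{BASE}?{urlencode({'what': what, 'where': where})}"

for url in search_urls(["plumber"], ["Boston, MA"]):
    print(url)
```

In practice the `whats` and `wheres` lists come from a category taxonomy and a list of cities, and each generated URL is then fetched and parsed.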

You can find more on how to scrape data aggregators in this post.

*You have to adhere to the ToS of each particular website/web service when you scrape its data.

The most successful enterprises are always the ones which manage to stay a step ahead of their rivals. And to remain ahead, you have to be able to access industry information faster and more consistently than anybody else. This is especially true for the e-commerce and online retail industries, where the pricing contest is extremely fierce. Thus, the smallest improvements in information processes can result in large changes in the outcomes. more…

Strategies on how to protect your data from cyber theft


Cyber-attacks are becoming a real threat to businesses both small and large. The damage they bring into people’s lives is more severe than people presume. In 2019, hundreds of billions of dollars went down this tunnel, and the crime is yet to stop. As the threat landscape evolves, attacks are becoming more and more sophisticated. It has also become clear that big companies cannot be 100% secure from such breaches. The real question is: if hackers manage to attack the big companies, how long would it take them to steal your data? The only way to handle this menace is to understand these basic security strategies and implement them. more…

A revolutionary web scraping software to boost your business

If you were an Amazon seller, would you want to know the listing price of a product across all competitors? Since you don’t have direct access to the Amazon database, you would have to browse and click through every listing in order to construct a table of sellers and prices. This is where a web scraping tool comes in handy: it automatically downloads your desired information, such as product name, seller’s name, price, etc. However, web scraping that requires coding skills can be painful for professionals outside IT, such as those in SEO, marketing, e-commerce, real estate, hospitality, etc.

It seems beyond one’s job description to have to learn how to code in order to obtain certain useful data from the web. For example, I have a friend who graduated in Mass Communication and works as a content marketer. She wanted to scrape some data from the web, so she decided to learn Python herself. It took her two weeks to come up with a page of messy code. Not only did she waste time learning Python, but she also lost time she could have used for her real work. more…
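For readers who do code, the seller-and-price table mentioned above can be sketched in a few lines with the standard library. The markup below (class names `seller` and `price`) is hypothetical; a real listing page differs, which is exactly why point-and-click tools save time.

```python
# Minimal sketch of what a scraping tool automates: turning listing HTML
# into a table of (seller, price) rows. The markup is hypothetical.
from html.parser import HTMLParser

class ListingParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self._field = None   # which field the next text node belongs to
        self._current = {}   # partially collected row
        self.rows = []       # completed (seller, price) pairs

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in ("seller", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            if len(self._current) == 2:
                self.rows.append((self._current["seller"], self._current["price"]))
                self._current = {}

html = """
<div class="listing"><span class="seller">ShopA</span><span class="price">$19.99</span></div>
<div class="listing"><span class="seller">ShopB</span><span class="price">$18.49</span></div>
"""
parser = ListingParser()
parser.feed(html)
```

After `feed()`, `parser.rows` holds the seller/price table that would otherwise be built by hand.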

Meet Phantombuster – an awesome tool for creating your own APIs and extending your audience via social networks

As you know, the huge social networks are very useful instruments for growing a business, especially an IT business. Developers, designers, CEOs, HR and product managers share useful information and look for useful acquaintances, business partners and co-workers. But how does one automate the process of searching for and attracting new people to your resource? With Phantombuster it’s not a problem at all. In today’s article we will look at how to use the Phantombuster APIs in different areas. more…

Back to top