Survey finds Hadoop is too slow, big data is too complex
A new report from Paradigm4 reveals frustration from data scientists
Seventy-six percent of data scientists say that Hadoop is too slow, according to a survey from analytics company Paradigm4. Data scientists believe the open-source software framework requires too much effort to program and isn’t fast enough to keep up with big data demands.
Ninety-one percent of survey respondents say they are performing complex analysis of big data, and 39% say their jobs are becoming more difficult as a result. Seventy-one percent say the growing variety of data sources, in addition to sheer volume, is making analysis harder.
Of the 76% of respondents who report problems with Hadoop, 39% say it requires too much effort to program, 37% say it is too slow for ad hoc queries, and 30% say it is too slow for real-time analytics.
The ubiquity of big data
Big data is becoming increasingly important for all enterprises. Ninety-six percent of midmarket companies with 2,000 to 5,000 employees are embracing the rise of big data, according to research commissioned by Dell and conducted by Competitive Edge Research. Eighty percent of midmarket companies say they need to analyze their data better, believing big data initiatives provide a significant boost to company decision-making.
For small businesses, free and cheaper tools are making it more likely that collecting and analyzing big data will become a necessity in order to compete.
The Paradigm4 survey features responses from 111 data scientists in the US. Responses were collected over a one-month span in March and April.