There’s no single technology that encompasses big data analytics. Advanced analytics can certainly be applied to big data, but in reality several types of technology work together to help you get the most value from your information.

Here are the biggest players:

- Cloud computing
- Data management
- Data mining
- Data storage, including the data lake and data warehouse
- Hadoop
- In-memory analytics
- Machine learning
- Predictive analytics
- Text mining

Cloud computing. A subscription-based delivery model, cloud computing provides the scalability, fast delivery and IT efficiencies required for effective big data analytics. Because it removes many physical and financial barriers to aligning IT needs with evolving business goals, it is appealing to organizations of all sizes.

Data management. Data needs to be high quality and well-governed before it can be reliably analyzed. With data constantly flowing in and out of an organization, it’s important to establish repeatable processes to build and maintain standards for data quality. Once data is reliable, organizations should establish a master data management program that gets the entire enterprise on the same page.
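To make "repeatable processes" a little more concrete, here is a minimal sketch of a data-quality check in Python using pandas. The column names and sample values are hypothetical, and a real program would encode the organization's own rules and thresholds; this only shows the shape of a check that could be run on every incoming batch.

```python
# A minimal, repeatable data-quality check using pandas.
# The column names ("customer_id", "email") are hypothetical.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return simple data-quality metrics that can be tracked over time."""
    return {
        "row_count": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": {col: int(n) for col, n in df.isna().sum().items()},
    }

if __name__ == "__main__":
    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", None, "b@example.com", "c@example.com"],
    })
    print(quality_report(df))
```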

Data mining. Data mining technology helps you examine large amounts of data to discover patterns – information that can then be used for further analysis to help answer complex business questions. With data mining software, you can sift through all the chaotic and repetitive noise in data, pinpoint what’s relevant, use that information to assess likely outcomes, and then accelerate the pace of making informed decisions.
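As a rough illustration of the kind of pattern discovery data mining software automates, the sketch below clusters synthetic customer records with scikit-learn's k-means. The two features and the choice of three clusters are assumptions made for the example, not a prescribed method.

```python
# A minimal data mining sketch: discovering customer segments with k-means.
# The "spend" and "visits" features and the 3-cluster choice are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic data standing in for real transaction history.
X = np.vstack([
    rng.normal([20, 2], 2, size=(100, 2)),    # low spend, few visits
    rng.normal([80, 10], 5, size=(100, 2)),   # high spend, frequent visits
    rng.normal([50, 20], 4, size=(100, 2)),   # mid spend, very frequent visits
])

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)

for cluster in range(3):
    members = X[labels == cluster]
    print(f"cluster {cluster}: {len(members)} customers, "
          f"avg spend {members[:, 0].mean():.1f}, avg visits {members[:, 1].mean():.1f}")
```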

Data storage, including the data lake and data warehouse. It’s vital to be able to store vast amounts of structured and unstructured data – so business users and data scientists can access and use the data as needed. A data lake rapidly ingests large amounts of raw data in its native format. It’s ideal for storing unstructured big data like social media content, images, voice and streaming data. A data warehouse stores large amounts of structured data in a central database. The two storage methods are complementary; many organizations use both.
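The sketch below illustrates the contrast, with a local folder standing in for a data lake and SQLite standing in for a warehouse. The paths, schema and sample event are made up for the example; real systems would use cloud object storage and a dedicated warehouse platform.

```python
# A minimal sketch contrasting a data lake (raw files, native format) with a
# data warehouse (structured, central tables). A local folder and SQLite stand
# in for real object storage and a real warehouse; the schema is illustrative.
import json
import sqlite3
from pathlib import Path

# Data lake: land the raw event exactly as received.
lake = Path("lake/events")
lake.mkdir(parents=True, exist_ok=True)
raw_event = {"user": "u123", "action": "click", "ts": "2024-01-01T12:00:00Z"}
(lake / "event_0001.json").write_text(json.dumps(raw_event))

# Data warehouse: load cleaned, structured records into a central table.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS events (user TEXT, action TEXT, ts TEXT)")
conn.execute("INSERT INTO events VALUES (?, ?, ?)",
             (raw_event["user"], raw_event["action"], raw_event["ts"]))
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0], "rows in warehouse")
```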

Hadoop. This open-source software framework can store large amounts of data and run parallel applications on clusters of commodity hardware. It has become a key technology for doing business as data volumes and varieties keep growing, and its distributed computing model processes big data quickly. An added benefit is that the framework is free and relies on inexpensive commodity hardware rather than specialized systems.
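Hadoop's distributed computing model is easiest to see in the classic word-count example. The sketch below simulates the map, shuffle/sort and reduce steps locally in plain Python; on a real cluster the same map and reduce logic would run in parallel across many machines (for instance as scripts submitted through Hadoop Streaming). The input lines are made up.

```python
# A minimal word-count sketch in the MapReduce style that Hadoop popularized.
# Run locally it only simulates the map -> shuffle/sort -> reduce flow; on a
# cluster these steps would be distributed across nodes.
from itertools import groupby
from operator import itemgetter

lines = ["big data needs big storage", "hadoop stores big data"]

# Map: emit (word, 1) for every word in every line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle/sort: group pairs by key, as the framework would between map and reduce.
mapped.sort(key=itemgetter(0))

# Reduce: sum the counts for each word.
for word, group in groupby(mapped, key=itemgetter(0)):
    print(word, sum(count for _, count in group))
```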

In-memory analytics. By analyzing data in system memory (instead of reading it from hard disk drives), you can derive immediate insights from your data and act on them quickly. This technology removes data prep and analytical processing latencies when testing new scenarios and building models; it’s not only an easy way for organizations to stay agile and make better business decisions, it also lets them run iterative and interactive analytics scenarios.
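A small illustration of the idea: once a dataset sits in memory, follow-up questions can be asked and answered immediately without going back to disk. The sketch below uses pandas with synthetic sales data; the columns and values are invented for the example.

```python
# A minimal in-memory analytics sketch: after the data is loaded into RAM as a
# DataFrame, repeated "what if" aggregations run without touching the disk again.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
sales = pd.DataFrame({
    "region": rng.choice(["north", "south", "east", "west"], size=100_000),
    "revenue": rng.gamma(shape=2.0, scale=50.0, size=100_000),
})

# Iterative, interactive questions answered from memory, one after another.
print(sales.groupby("region")["revenue"].sum().round(0))
print(sales[sales["revenue"] > 200].groupby("region").size())
```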

Machine learning. Machine learning, a specific subset of AI that trains a machine how to learn, makes it possible to quickly and automatically produce models that can analyze bigger, more complex data and deliver faster, more accurate results – even on a very large scale. And by building precise models, an organization has a better chance of identifying profitable opportunities – or avoiding unknown risks.
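As a minimal sketch of what automatically producing a model looks like in practice, the example below fits a random forest to synthetic data with scikit-learn and checks its accuracy on held-out records. The dataset and model choice are illustrative, not a recommendation for any particular problem.

```python
# A minimal machine learning sketch: fit a model to data, then measure how well
# it generalizes to records it has never seen. Synthetic data for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```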

Predictive analytics. Predictive analytics technology uses data, statistical algorithms and machine-learning techniques to identify the likelihood of future outcomes based on historical data. It’s all about providing the best assessment of what will happen in the future, so organizations can feel more confident that they’re making the best possible business decision. Some of the most common applications of predictive analytics include fraud detection, risk, operations and marketing.
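The sketch below shows the basic shape of a predictive model: fit to historical records, then used to score the likelihood of an outcome for a new case. The fraud scenario, the features and the rule used to generate the toy labels are all invented for illustration.

```python
# A minimal predictive analytics sketch: estimating the likelihood of a future
# outcome (a made-up "fraud" label) from historical data with logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# Historical transactions: amount and hour of day. In this toy generator, fraud
# is more likely for large late-night amounts (an assumption for the example).
amount = rng.gamma(2.0, 100.0, size=2_000)
hour = rng.integers(0, 24, size=2_000)
fraud = ((amount > 300) & ((hour < 6) | (hour > 22))).astype(int)

X = np.column_stack([amount, hour])
model = LogisticRegression(max_iter=1000).fit(X, fraud)

# Score a new transaction: probability that it is fraudulent.
new_txn = np.array([[450.0, 3]])   # $450 at 3 a.m.
print(f"estimated fraud probability: {model.predict_proba(new_txn)[0, 1]:.2f}")
```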

Text mining. With text mining technology, you can analyze text data from the web, comment fields, books and other text-based sources to uncover insights you hadn’t noticed before. Text mining uses machine learning or natural language processing technology to comb through documents – emails, blogs, Twitter feeds, surveys, competitive intelligence and more – to help you analyze large amounts of information and discover new topics and term relationships.
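As a small example of turning free text into something analyzable, the sketch below weights terms with TF-IDF and pulls out the most characteristic words of each comment using scikit-learn. The comments themselves are made up; real text mining pipelines add richer natural language processing on top of this kind of representation.

```python
# A minimal text mining sketch: convert free text to TF-IDF term weights and
# surface each document's most characteristic terms. Example comments are made up.
from sklearn.feature_extraction.text import TfidfVectorizer

comments = [
    "shipping was slow and the package arrived damaged",
    "great product, fast shipping, would buy again",
    "customer support never answered my emails about the damaged package",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(comments)
terms = vectorizer.get_feature_names_out()

for i, row in enumerate(tfidf.toarray()):
    top = sorted(zip(terms, row), key=lambda t: t[1], reverse=True)[:3]
    print(f"comment {i}: " + ", ".join(f"{term} ({weight:.2f})" for term, weight in top))
```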