Technologies

Sentiment Analysis


Sentiment Analysis is the process of determining whether a piece of writing is negative, neutral or positive. A sentiment analysis system for text analysis combines Natural Language Processing (NLP) and machine learning techniques to assign weighted sentiment scores to the entities, topics, themes and categories within a sentence or phrase.
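
As a minimal illustration of assigning sentiment scores to text, the sketch below scores a sentence with a tiny hand-made lexicon; the word list, weights and thresholds are invented for illustration and are not a production sentiment lexicon.

    # Minimal lexicon-based sentiment scorer (illustrative only; the word
    # weights below are made-up examples, not a real sentiment lexicon).
    LEXICON = {
        "great": 1.0, "good": 0.5, "love": 1.0,
        "bad": -0.5, "terrible": -1.0, "hate": -1.0,
    }

    def sentiment_score(text: str) -> float:
        """Return the average sentiment weight of the known words in the text."""
        words = text.lower().split()
        weights = [LEXICON[w] for w in words if w in LEXICON]
        return sum(weights) / len(weights) if weights else 0.0

    def label(score: float) -> str:
        if score > 0.1:
            return "positive"
        if score < -0.1:
            return "negative"
        return "neutral"

    print(label(sentiment_score("I love this great product")))  # positive
    print(label(sentiment_score("the delivery was terrible")))  # negative

Real systems replace the hand-made lexicon with NLP preprocessing and machine-learned weights, but the basic idea of mapping words and phrases to weighted scores is the same.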

Neural Networks


The human brain unconsciously processes millions of signals a day, making it one of the most remarkable information-processing systems we know. Humans have an extraordinary learning ability, being able to connect pieces of information and to interpret visual and audio signals. Researchers used the brain as a source of inspiration during the development of artificial intelligence and came up with artificial neural networks.
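
As a minimal, concrete example of the building block behind such networks, the sketch below computes the output of a single artificial neuron: a weighted sum of its inputs passed through a sigmoid activation. The inputs, weights and bias are arbitrary illustrative values, not learned parameters.

    import math

    def neuron(inputs, weights, bias):
        """A single artificial neuron: weighted sum of the inputs plus a bias,
        passed through a sigmoid activation function."""
        z = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes the output to (0, 1)

    # Arbitrary example values: three input signals and their weights.
    print(neuron([0.5, 0.2, 0.9], [0.4, -0.6, 0.8], bias=0.1))

A neural network connects many such neurons in layers and adjusts the weights during training.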

Text Mining


Text mining is the process of exploring and analyzing large amounts of unstructured text data aided by software that can identify concepts, patterns, topics, keywords and other attributes in the data.
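
A very small sketch of this idea is shown below: counting word frequencies in a piece of unstructured text to surface candidate keywords. The stop-word list and the sample sentence are illustrative assumptions; real text-mining software uses far richer linguistic analysis.

    import re
    from collections import Counter

    STOP_WORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "into"}  # tiny illustrative list

    def top_keywords(text: str, n: int = 5):
        """Return the n most frequent non-stop-words as simple candidate keywords."""
        words = re.findall(r"[a-z']+", text.lower())
        counts = Counter(w for w in words if w not in STOP_WORDS)
        return counts.most_common(n)

    sample = "Text mining turns unstructured text into structured insight; text data is everywhere."
    print(top_keywords(sample))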

Machine Learning


Most recent advances in artificial intelligence have been achieved by applying machine learning to very large data sets. Machine learning algorithms detect patterns and learn how to make predictions and recommendations by processing data and experience, rather than by following explicit programming instructions. The algorithms also adapt in response to new data and experience to improve their efficacy over time.
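
As a sketch of "learning from data rather than explicit rules", the example below fits a small classifier with the scikit-learn library (assumed to be installed); the toy customer data and the churn labels are invented for illustration.

    from sklearn.linear_model import LogisticRegression

    # Toy data: hours of product usage and number of support tickets per customer,
    # with a label indicating whether the customer churned (invented numbers).
    X = [[1, 5], [2, 4], [8, 0], [9, 1], [7, 1], [1, 6]]
    y = [1, 1, 0, 0, 0, 1]  # 1 = churned, 0 = stayed

    model = LogisticRegression()
    model.fit(X, y)  # the algorithm detects the pattern from the data itself

    print(model.predict([[2, 5], [8, 0]]))  # predictions for unseen customers

No rule such as "few usage hours and many tickets means churn" is ever written down; the model infers it from the examples.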

Data Mining


Data mining is the process of finding patterns and relationships in large datasets. The difference between data analysis and data mining is that data analysis is used to test models and hypotheses on a dataset, regardless of the amount of data. For example, we can analyze the effectiveness of a marketing campaign for different car models, or predict bicycle sales in the coming month. In contrast, data mining is used to explore hidden patterns in large volumes of data. For example, we can extract driving behavior from connected car data or find anomalies in online payments.
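
As a small illustration of mining a dataset for hidden anomalies, the sketch below flags outliers in a list of payment amounts using a simple z-score rule; the payment amounts and the threshold of two standard deviations are illustrative choices.

    import statistics

    def find_anomalies(values, threshold=2.0):
        """Flag values lying more than `threshold` standard deviations from the mean."""
        mean = statistics.mean(values)
        stdev = statistics.stdev(values)
        return [v for v in values if abs(v - mean) / stdev > threshold]

    payments = [20.5, 19.9, 21.3, 20.0, 22.1, 19.7, 20.8, 540.0]  # invented online payments
    print(find_anomalies(payments))  # the unusually large payment is flagged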

Simulation Models


A simulation model represents a real situation as a simplified mathematical model. In situations with a lot of uncertainty, different scenarios can be tested without having to implement them in real life. The mathematical model should capture the complexity of the real-world situation, so that results obtained from the simulation can be translated back to the real situation and used as a basis for decisions.
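
The sketch below is a minimal Monte Carlo-style example of this idea: daily demand is modelled with an assumed probability distribution, and the model is run many times to compare two stock-level scenarios. The demand distribution and all numbers are invented for illustration.

    import random

    def simulate_lost_sales(stock_level, days=30, runs=10_000):
        """Average number of lost sales over a month, under an assumed demand model."""
        total_lost = 0
        for _ in range(runs):
            for _ in range(days):
                demand = random.randint(0, 10)  # assumed daily demand distribution
                total_lost += max(0, demand - stock_level)
        return total_lost / runs

    # Compare two scenarios without changing anything in the real world.
    for stock in (5, 8):
        print(f"stock={stock}: ~{simulate_lost_sales(stock):.1f} lost sales per month")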

Web Scraping


Web Scraping is a technique used for extracting and structuring large amounts of data from websites. The purpose of Web Scraping is to automatically collect information from the internet and store it in a structured way.
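
Below is a minimal sketch of this idea using the widely used requests and BeautifulSoup libraries (assumed to be installed). The URL is a placeholder, and the choice to collect link texts is only an example; both would need to be adapted to the target website.

    import requests
    from bs4 import BeautifulSoup

    URL = "https://example.com/products"  # placeholder URL, not a real data source

    # Download the page and parse the HTML into a navigable tree.
    response = requests.get(URL, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    # Extract and structure the data: here, the text and target of every link on the page.
    rows = [{"text": a.get_text(strip=True), "href": a.get("href")}
            for a in soup.find_all("a")]

    for row in rows:
        print(row)

The structured rows can then be stored in a database or a CSV file for further analysis.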

Predictive Modeling


Predictive modeling is a commonly used statistical technique for forecasting future behavior. Predictive modeling solutions are a form of data-mining technology that analyzes historical and current data and generates a model to help predict future outcomes.
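
As a minimal sketch, the example below fits a simple linear model to historical data and uses it to predict the next value; the monthly sales figures are invented for illustration.

    def fit_line(xs, ys):
        """Ordinary least-squares fit of y = a*x + b."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
        b = mean_y - a * mean_x
        return a, b

    # Historical monthly sales (invented numbers): month index vs. units sold.
    months = [1, 2, 3, 4, 5, 6]
    sales = [102, 110, 118, 123, 131, 140]

    a, b = fit_line(months, sales)
    print(f"predicted sales for month 7: {a * 7 + b:.0f}")

Practical predictive models use many more variables and more flexible algorithms, but the principle of fitting a model to historical data and extrapolating it forward is the same.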