Guide your data science and machine learning projects through five key stages: define, transform, model, deploy, and act. Organize business stakeholders, data scientists, and engineers into project teams that collaborate to tackle the problem at hand. Team members share and annotate analytic assets such as data, scripts, and trained models within the project.
Quickly build and share drag-and-drop machine learning and ETL workflows with the Workflow Editor. Want the flexibility of Python for exploratory data analysis? Use the Notebook Editor to easily run Python against Hadoop and RDBMS data sources. Notebooks are managed and versioned just like everything else in Chorus, making them easy to collaborate on.
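A notebook cell along these lines shows the pattern. This is a minimal sketch using Python's stdlib `sqlite3` as a stand-in for whatever RDBMS driver your environment provides; the `sales` table, `daily_averages` helper, and all data are illustrative, not part of Chorus itself.

```python
# Notebook-style exploratory analysis against a SQL source.
# sqlite3 (stdlib) stands in here for your Hadoop or RDBMS connection;
# in a real Chorus notebook you would swap in the appropriate driver.
import sqlite3

def daily_averages(conn):
    """Return (day, avg_amount) rows, aggregated in SQL rather than in Python."""
    cur = conn.execute(
        "SELECT day, AVG(amount) FROM sales GROUP BY day ORDER BY day"
    )
    return cur.fetchall()

# Build a tiny in-memory table to explore.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("mon", 10.0), ("mon", 30.0), ("tue", 5.0)],
)

print(daily_averages(conn))  # [('mon', 20.0), ('tue', 5.0)]
```

Pushing the aggregation into SQL keeps the heavy lifting on the data source, which is the same habit that pays off when the source is Hadoop-scale.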
Schedule Workflows to run on an hourly, daily, or weekly basis. Push real-time scoring engines based on PMML to Cloud Foundry, AWS, or Google App Engine. Need engines that handle data transformation as well as scoring? Alpine is a member of the working group for PFA, the successor to PMML, and can bring you to the cutting edge of real-time transformation and scoring engines.
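To make the PMML/PFA idea concrete: both describe a model as a document that a scoring engine executes, so the engine, not your application code, owns the math. The sketch below builds a tiny PFA-flavored document and scores it with a toy evaluator written for this example; the `toy_score` function is purely illustrative and is not a real PFA runtime, which would support the full PFA expression language.

```python
# A PFA-style document: typed input/output plus an "action" expression.
# Real engines compile documents like this; the toy evaluator below only
# understands a single "+" action, enough to show the scoring flow.
import json

pfa_doc = json.loads("""
{
  "input": "double",
  "output": "double",
  "action": {"+": ["input", 1.0]}
}
""")

def toy_score(doc, value):
    """Evaluate only the '+' action of this illustrative document."""
    op, args = next(iter(doc["action"].items()))
    operands = [value if a == "input" else a for a in args]
    if op == "+":
        return sum(operands)
    raise NotImplementedError(op)

print(toy_score(pfa_doc, 2.0))  # 3.0
```

Because the model lives in a portable document rather than in application code, the same artifact can be pushed unchanged to Cloud Foundry, AWS, or Google App Engine.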
Models are only as valuable as the business behaviors they affect. Push insights from your scoring engines directly to the point of action with Touchpoints and the Touchpoints SDK. Touchpoints turn sophisticated data science projects into easy-to-consume UIs for employees and customers. Embed insights into existing applications or bring them to mobile using the Touchpoints SDK.