FAQ
1. Which engine is Neuton based on for building neural networks (TensorFlow, Pytorch, Caffe, etc)?
The architecture of the Neuton Neural Network Framework is not based on any existing solutions. The Neuton engine was designed, developed, and implemented exclusively by Neuton's scientists and engineers, from scratch, and the technology has been patented. Neuton AutoML is based on the Neuton Neural Network Framework and automates the whole process of machine learning model creation, including data preprocessing, feature engineering, etc.
2. How can I adjust the number of neurons and layers in the learning model process on Neuton?
The neuron and layer configuration is adjusted automatically, so no action is required on your part. However, you may limit the maximum number of coefficients if desired.
3. How can I be sure that Neuton's models are really compact?
The resulting number of coefficients in the trained model is evidence of how compact Neuton models are. In addition, you can compare the size in kilobytes of Neuton models with the resulting models from other solutions.
4. Does Neuton perform cross-validation during training?
Yes. Neuton performs 5-fold cross-validation during training.
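For readers unfamiliar with the concept, below is a minimal illustrative sketch of 5-fold cross-validation in Python using scikit-learn. It is not Neuton's internal implementation, only a conceptual example of how a model is scored across five folds.

```python
# Conceptual illustration of 5-fold cross-validation (not Neuton's internal code).
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

X = np.random.rand(100, 3)          # toy features
y = X @ np.array([1.5, -2.0, 0.7])  # toy target

scores = []
for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=42).split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    scores.append(mean_squared_error(y[val_idx], model.predict(X[val_idx])))

print("Mean validation MSE across 5 folds:", np.mean(scores))
```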
5. When using other traditional frameworks, I can choose when training can stop. Can I also control these parameters in Neuton?
In regard to the training control options:
Yes, you can stop the training if you are comfortable with the displayed validation metrics
You can limit the number of model coefficients, in which case training will stop upon reaching the limit value
You can limit the training time, in which case training will stop upon reaching the limit value
6. Is it possible to use Neuton in conjunction with other models, for example, with the sklearn library?
The trained model, downloaded from the platform, contains all the information about coefficients, weights, and other parameters obtained during training. With every trained model we provide Python scripts for calculating predictions, which can easily be combined with other algorithms.
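As a hedged illustration, the sketch below shows one way predictions from the downloaded scripts could be combined with a scikit-learn model, for example by averaging. The module and function names for the Neuton script (`neuton_model_script`, `neuton_predict`) are hypothetical placeholders; consult the readme in your downloaded archive for the actual entry points.

```python
# Hypothetical sketch: combining Neuton predictions with a scikit-learn model.
# The Neuton import below is a placeholder; see the readme in the downloaded
# archive for the actual script names and interfaces.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# from neuton_model_script import predict as neuton_predict  # placeholder import

train = pd.read_csv("train.csv")
new_data = pd.read_csv("new_data.csv")

X_train, y_train = train.drop(columns=["target"]), train["target"]

# Train a complementary scikit-learn model on the same data.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

rf_preds = rf.predict(new_data)
# neuton_preds = neuton_predict(new_data)       # predictions from the downloaded script
# blended = (neuton_preds + rf_preds) / 2       # e.g. a simple average of both models
```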
1. Which technical skills are necessary to set up the infrastructure (GPUs, virtual machines, storage, etc.) for training models or making predictions using Neuton?
There are virtually no technical skills required. Neuton is a cloud-based SaaS solution. Infrastructure setup is absolutely seamless for users and everything is provisioned automatically.
2. Some AutoML solutions require system administrator expertise even for the initial setup, as well as scripting or software development skills to deploy a model or make predictions. Do I need system administrator, scripting, or coding skills to set up and use Neuton?
Absolutely not. You do not need administrative expertise to set up Neuton, nor any scripting or coding skills to train your model or make predictions. In order to use Neuton you do not need to write any code or scripts, period. Everything is done through our web interface and no special IT background is necessary.
3. I am not a Data Scientist, I don’t have any mathematical/statistical skills. Can I use Neuton without additional education?
We’ve worked diligently to ensure that the workflow is very simple, even for a beginner. You also do not need the specialized knowledge of statistics typically required by existing AutoML solutions. Along with the straightforward user interface, we offer online help, documentation, and video tutorials in case you’d like to learn more, or if some aspect of the process is not immediately clear.
4. When I download a trained model for local use, what do I receive besides the model in binary format, and how can I find out how to use it locally?
When you download a trained model for local use you receive all of the following:
Example Python scripts for making predictions on new data (for regression and classification)
Folders with dictionaries and Python scripts required for transforming new data according to the preprocessing applied to the training set
Readme files
A model viewer
Instructions on how to use the downloaded model can be found in the readme.txt file within the downloaded archive or in the online documentation.
5. Can I use my own validation data?
A user can upload both training and validation datasets separately to perform training. If a user does not have a validation dataset, cross validation will be enabled automatically.
6. Is it possible to upload a whole CSV file (test dataset) for predictions, and download a CSV file with all predictions?
Yes, you can get predictions on a whole CSV file and download a CSV file with the predictions locally. The file will include the original test data with an appended predictions column corresponding to each row of the dataset.
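As a simple illustration (the file and column names below are assumptions, not Neuton's fixed naming), the downloaded file can be inspected locally like any other CSV:

```python
# Minimal sketch: inspecting a downloaded predictions file (names are assumptions).
import pandas as pd

results = pd.read_csv("predictions.csv")   # original test data plus a predictions column
print(results.columns.tolist())            # e.g. [..., "prediction"]
print(results.head())
```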
7. Can I send new data for predictions from my program/environment/web?
After a model is trained with Neuton, you have the option to deploy the model to the cloud automatically, with REST API access from anywhere on the web.
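The exact endpoint URL, authentication, and payload format are provided after deployment; the sketch below only illustrates the general pattern of calling a deployed model over REST from Python. The URL and field names here are hypothetical placeholders.

```python
# Hypothetical sketch of calling a deployed model over REST.
# The endpoint URL and payload schema are placeholders; use the details
# provided by the platform after you deploy your model.
import requests

ENDPOINT = "https://example.invalid/your-deployed-model/predict"   # placeholder URL

payload = {"rows": [{"feature_1": 3.2, "feature_2": "red"}]}        # placeholder schema
response = requests.post(ENDPOINT, json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```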
8. How can I interpret the Feature Importance Matrix?
The Feature Importance Matrix shows the relative contribution of each feature to the model. A higher feature importance value (the feature's position is closer to the top of the matrix compared to another feature) means that the feature is more important for generating a prediction.
9. What Model Evaluation Metrics are used in Neuton?
Neuton provides multiple metrics for assessing model performance, depending on the task type.
For Binary Classification: AUC, Gini, LogLoss, Accuracy, Balanced accuracy, Precision, F1, Recall, Lift.
For Multiclass Classification: LogLoss, Accuracy, Balanced accuracy, Precision (macro|weighted), F1 (macro|weighted), Recall (macro|weighted).
For Regression: MSE, RMSE, RMSLE, MAE, RMSPE, and R2.
You can select any metric applicable to the task type as the target metric, but Neuton calculates all applicable metrics for your convenience, as well as a Feature Importance Matrix and a Confusion Matrix (for classification task types only). You can learn more about each metric on Wikipedia.
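If you want to reproduce any of these metrics locally as a sanity check, most of them are available in scikit-learn. The snippet below is a small illustration for a binary classification task using toy labels and predictions, not Neuton output.

```python
# Sanity-check sketch: computing a few of the listed binary-classification metrics
# with scikit-learn on toy data (not Neuton's internal computation).
from sklearn.metrics import roc_auc_score, log_loss, accuracy_score, f1_score

y_true = [0, 1, 1, 0, 1, 0]
y_prob = [0.2, 0.8, 0.6, 0.3, 0.9, 0.4]           # predicted probabilities
y_pred = [int(p >= 0.5) for p in y_prob]          # thresholded class labels

print("AUC:     ", roc_auc_score(y_true, y_prob))
print("Gini:    ", 2 * roc_auc_score(y_true, y_prob) - 1)  # Gini = 2*AUC - 1
print("LogLoss: ", log_loss(y_true, y_prob))
print("Accuracy:", accuracy_score(y_true, y_pred))
print("F1:      ", f1_score(y_true, y_pred))
```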
10. Can Neuton help me understand my data before model training? In some cases it can take hours to train.
Yes, before model training, Neuton builds an EDA (Exploratory Data Analysis) report, which is unique for every dataset. The EDA provides the following information:
Dataset overview (dataset dimensions, data types, etc.)
Continuous data distribution and relation to the target variable
Discrete data distribution and relation to the target variable
Feature correlations
Target variable distribution
Information about outliers
Time Dependencies
Information about missing values
11. If I want to look inside the "black box" of the neural network to understand the model's behavior for a selected row in the test dataset, should I generate similar rows, changing a selected feature while keeping the other features fixed, or is there an easier way?
No need to go to all that hassle; there is, indeed, an easier way! The Model Interpreter allows you to interactively change the values of the original features and see the prediction result in real time. For continuous and discrete features, the Model Interpreter builds a graphical representation of their relation to the target variable, and for classification tasks you can see the probabilities of the predicted classes on the graph. It also allows you to specify a threshold value and see the feature values for which prediction results are below or above the threshold. Furthermore, for continuous features, the feature influence will show you the trend in the target variable value.
12. How can I understand the model predictive power besides the target metric value?
For the regression task type you can use the Confidence Interval, which shows the possible prediction spread with a calculated probability.
13. Why is my prediction failing?
On occasion you may receive a "Prediction failed" message. This is generally caused by the data uploaded for predictions not matching the data that was used for model training. Both the number and the position of columns should be the same, with the only exception being that the 'target' column should be excluded from the data used for predictions.
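A quick way to catch this before uploading is to compare the columns of the prediction file with those of the training file. A minimal sketch follows; the file and target-column names are assumptions.

```python
# Minimal pre-upload check: prediction columns must match training columns
# (minus the target). File and column names here are assumptions.
import pandas as pd

train_cols = list(pd.read_csv("train.csv", nrows=0).columns)
pred_cols = list(pd.read_csv("to_predict.csv", nrows=0).columns)

expected = [c for c in train_cols if c != "target"]   # training columns without the target
if pred_cols != expected:
    print("Mismatch!")
    print("Expected:", expected)
    print("Got:     ", pred_cols)
else:
    print("Column names and order match; the file should be accepted.")
```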
14. In cases where my data changes over time, how can I detect that I need to retrain a model on new data?
You can easily detect when a model needs to be retrained with the Historical Model-to-Data Relevance Indicator. It is based on data from the Model-to-Data Relevance Indicator. This metric aggregates, over time, the statistical differences between the training data and all new data (batches and/or single rows) sent to the given model for predictions.

Model-to-Data Relevance Indicator Value Range:
100% – The data has not changed. Model prediction quality will not decrease and will match the validation metric.

1% – The new data is materially different from the data used for training, which means that the target metric value may change significantly. A significant decrease in the Historical Model-to-Data Relevance Indicator's value may indicate metric decay (model prediction quality degradation) and typically means that you need to retrain your model on new data.
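Neuton computes this indicator for you. Purely as a conceptual illustration of what "statistical difference between training data and new data" can mean, the sketch below compares per-feature distributions with a Kolmogorov-Smirnov test. This is a generic drift check, not Neuton's actual formula, and the file names are assumptions.

```python
# Conceptual drift-check sketch (NOT Neuton's Model-to-Data Relevance formula):
# compare each numeric feature's distribution in training vs. new data.
import pandas as pd
from scipy.stats import ks_2samp

train = pd.read_csv("train.csv")
new = pd.read_csv("new_batch.csv")

for col in train.select_dtypes("number").columns:
    if col in new.columns:
        stat, p_value = ks_2samp(train[col].dropna(), new[col].dropna())
        flag = "possible drift" if p_value < 0.01 else "ok"
        print(f"{col}: KS statistic={stat:.3f}, p={p_value:.4f} ({flag})")
```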
1. I have a small (for example, 200 entry) dataset. Is it possible to build a machine learning model with Neuton using this data? For most algorithms - especially for gradient boosting - it appears difficult to do with such a small dataset…
If the underlying pattern is reflected in the small dataset, Neuton will definitely be able to build an optimal model that can be successfully extrapolated to a larger sample, whereas for most other algorithms such a dataset would not be enough, as they tend to work well only on larger datasets. However, if there is no regularity within the data, then no algorithm, including Neuton, can succeed in building a model. The minimal dataset size for Neuton is 2 columns x 50 rows (for example, for a regression time series task type).
2. I know that tree-based algorithms perform poorly on very ‘wide’ datasets (10K+ columns). How does Neuton Neural Network Framework digest such data?
Neuton performs equally well on a dataset with 2 columns or on a dataset with 10K+ columns. Neural networks naturally work well on 'wide' datasets, and Neuton is no exception.
3. Does Neuton have limitations on dataset size and what is the recommended data set size for the most effective results with Neuton?
No, Neuton does not have any limitations on dataset size. Neuton works equally effectively with both very small and (very) large data sets - and everything in-between.
4. Is there a limit to the number of rows/columns in the CSV files?
There is no limit to the number of rows/columns in the CSV files.
5. Which data format is acceptable for training a Neuton model?
Neuton works with datasets in a CSV file with the following formatting:
Datasets must be CSV files using UTF-8 encoding with, at minimum, 2 columns and 50 rows.
The first row in the dataset must contain the column names, and a comma, semicolon, pipe, caret, or tab must be used as the separator.
Currently Neuton supports only the EN-US locale for timestamps and numbers, so you must use a dot as the decimal separator and delete the commas typically used to separate every third digit in your numeric fields. If a numeric column is represented as a combination of a number and its corresponding unit, only the number should be placed in the column (e.g. "$20,000.00" should be replaced with "20000.00"; a minimal cleanup sketch follows this list).
Text fields should start and end with quotes in order to avoid confusing symbols in the text with column separators. Neuton supports NLP, but can currently only process textual fields in English (US). If you used a language other than English for categorical fields in your dataset, you must transliterate them into English (US).
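If your numeric fields contain currency symbols or thousands separators, a small preprocessing pass before upload can bring them into the expected format. Below is a minimal sketch; the column name "price" is an assumption for illustration.

```python
# Minimal cleanup sketch before upload (column name is an assumption):
# strip currency symbols/thousands separators and save as UTF-8 CSV.
import pandas as pd

df = pd.read_csv("raw_data.csv")

# Turn values like "$20,000.00" into 20000.00 in an assumed "price" column.
df["price"] = (
    df["price"].astype(str)
    .str.replace(r"[$,]", "", regex=True)
    .astype(float)
)

df.to_csv("clean_data.csv", index=False, encoding="utf-8")
```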
Other than that, no prior data preparation by the user is required. Our solution will examine each column and apply the necessary preprocessing techniques, including:
data cleaning
categorical binarization and one-hot-encoding
dates preprocessing
label encoding
normalization
text field processing
processing time dependencies in data
etc.
6. What advanced techniques are used for preprocessing and feature engineering in Neuton?
Preprocessing and feature engineering are performed with respect to the time dependencies in the data. User input gaps and the forecast timeframe are accounted for during validation.
Time-series specific feature engineering includes (a small pandas illustration follows this list):
Various date-related feature extraction (e.g. day of week, day of month, time of day, etc.)
Data aggregation
Stacking ensembles
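Date-related features such as those listed above are extracted automatically by Neuton; the sketch below only illustrates what such extraction looks like in pandas, not Neuton's pipeline.

```python
# Illustration of date-related feature extraction (done automatically by Neuton;
# shown here in pandas only to clarify what such features look like).
import pandas as pd

df = pd.DataFrame({"timestamp": pd.to_datetime(
    ["2023-01-02 08:15", "2023-01-03 17:40", "2023-01-08 23:05"])})

df["day_of_week"] = df["timestamp"].dt.dayofweek   # 0 = Monday
df["day_of_month"] = df["timestamp"].dt.day
df["hour_of_day"] = df["timestamp"].dt.hour
print(df)
```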
7. Does Neuton process text fields in data sets?
Yes, text data may contain additional information that makes for better predictions, and Neuton can automatically detect and process text fields in your data, so manual preprocessing is not required. Neuton generates additional features during the feature engineering stage using the TF-IDF approach to build models with the best predictive power.
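For context, TF-IDF converts text fields into numeric features by weighting how often a term appears in a row against how common it is across all rows. The sketch below shows the general technique with scikit-learn; it is not Neuton's internal feature generation.

```python
# Illustration of the general TF-IDF technique (not Neuton's internal pipeline).
from sklearn.feature_extraction.text import TfidfVectorizer

texts = [
    "fast delivery and great service",
    "slow delivery, poor service",
    "great product, fast shipping",
]

vectorizer = TfidfVectorizer()
tfidf_matrix = vectorizer.fit_transform(texts)   # one numeric feature per term

print(vectorizer.get_feature_names_out())
print(tfidf_matrix.toarray().round(2))
```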
8. Does Neuton support working with Time Series?
Yes, Neuton supports working with time series in order to correctly process time-dependent data. When the user uploads a dataset with a date/datetime column, Neuton builds a preprocessing and feature engineering pipeline with respect to the time dependencies in the data. Time-series specific feature engineering includes:
Various date-related feature extraction (e.g. day of week, day of month, time of day, etc.)
Data aggregation
Stacking ensembles
1. Do I need to upgrade my billing account?
Yes, you need to upgrade your billing account to a paid account in order to use Neuton.
1. How does a Free Trial work?
This trial lets you try Neuton’s Gravity plan for free by using eligible free trial credits.
Register as a new customer within the Google Cloud Platform to be eligible for up to $300 in free trial credits. Corporate customers are also eligible for an additional $200 in credits on top of the $300 free trial credits. In order to be qualified to redeem the additional $200 credits, customers must register as a new customer with Neuton via a corporate email domain (no personal email accounts allowed, e.g. Gmail, Yahoo).
When your trial use exceeds your available credits, you will be charged infrastructure fees unless you stop or terminate the subscription. You may cancel the trial at any time by cancelling your plan.
2. What are the Free Trial limitations?
With this free trial you can make a limited number of predictions. Predictions via REST API are limited to 5,000 API requests per month, and predictions via the web interface are limited to 100 rows in a CSV file.
3. What happens when the Free Trial is over?
You will continue using Neuton Gravity for free. You will only be charged infrastructure fees.
4. What happens if I cancel the subscription before the Free Trial is over?
You will no longer have access to your Neuton account, which will be deleted along with your models and datasets.
1. Which Neuton plan is best for me?
Gravity - allows you to test run Neuton, check the performance of models trained with your datasets, and do all of this for free by utilizing Google credit programs. Gravity is best suited for basic Neuton evaluation.
Neuton's First Law - lifts the limitations of the Gravity plan, thus allowing you to use Neuton with your live applications or services. This is the best option if you're planning to use neural networks for production-grade applications serving real customers.
Neuton's Second Law - the best choice if you're running a mature service with multiple customers and want to save money on operational costs without incurring increased upfront payment.
Neuton's Third Law - makes trained models available for you to download and use as your own. You can deploy models in your local environment, to a corporate data center, or to any cloud provider, where you're in control of their availability and the amount of hardware they run on.
Enterprise solution - if you need special conditions for using Neuton, this is your solution. Contact sales for more information.
Embedded solution - if you need to augment your device with AI model capabilities, this is your solution. Contact sales for more information.
2. How can I check what part of Google credit is still available for my use?
Open: https://console.cloud.google.com/billing.
Select your billing account on the "My billing accounts" page and click it.
On the "Overview" page that opens, click "Credit details" in the "Promotional credits" section.
The list of credits assigned to you will be presented on the "Credits" page that opens.
You can check the amount of credit still available to you in the column “Remaining value”.
3. When will I be charged for Neuton services?
You will be charged on the last day of each calendar month.
This pay cycle applies to all Neuton plans.
If you select a 1-year plan like Neuton's Second Law or Neuton's Third Law, a downgrade or cancellation of that plan will be effective after the end of the 12th calendar month after subscription. Until then, you will continue to be charged for the plan monthly.
Your subscription fee for the first month will be pro-rated, calculated to cover only those days you were subscribed to the Neuton service.
4. What am I paying for?
The full costs covered in the Google billing charges consist of the following:
Neuton subscription monthly fee for selected plan
Cost of training (depends on time in hours)
Cost of model hosting (depends on time in hours)
Storage cost (depends on dataset storage size in GB and storage period in months)
For more details refer to the links below:
Neuton subscription fees: https://neuton.ai.
Charge rates for Google usage:
https://console.cloud.google.com/marketplace/product/bellintegrator-public/neuton-automl.
5. How can I cancel my Neuton subscription?
To cancel your Neuton subscription, go to the Neuton service page on Google Cloud Platform and simply click "Cancel subscription".
Your cancellation will be effective depending on your current plan:
Gravity – instant cancellation during the first 7 days from subscription. From the 8th day on, cancellation will occur at the end of that calendar month.
Neuton's First Law – cancellation will be effective at the end of that calendar month
Neuton's Second Law – cancellation will be effective at the end of the 12th calendar month, based on the initial start date of the one-year subscription (the first month of subscription may be partial)
Neuton's Third Law – cancellation will be effective at the end of the 12th calendar month, based on the initial start date of the one-year subscription (the first month of subscription may be partial)
Please note that because the effective date of cancellation of the Neuton subscription is delayed until the end of the month (or the end of the subscription year for annual plans), you can use Neuton without limitations until the cancellation becomes effective. However, after the effective cancellation date, all models and data will be deleted immediately.
6. How will I be charged if I change my plan?
The answer depends on the old and new plans:
Gravity – when upgrading from Gravity to any other plan, the change is instant, and pro-rated charges for the current month will be consistent with the cost of the new paid plan subscription.
Neuton's First Law –
Upgrading to a more expensive plan – access is immediate, and charges will be recalculated proportionally.
Downgrading to a cheaper plan – both access and charges will be adjusted to the new plan rate at the end of the calendar month.
Neuton's Second Law –
Upgrading to a more expensive plan – access is immediate, and charges will be recalculated proportionally.
Downgrading to a cheaper plan – charges will be adjusted to the new plan rate at the end of the 12th month after the initial subscription date (the first month of subscription may be partial).
Neuton's Third Law –
Downgrading to a cheaper plan – charges will be adjusted to the new plan rate at the end of the 12th month after the initial subscription date (the first month of subscription may be partial).