…through a sophisticated learning strategy. Using ACS we can pose prototypical questions to the assigned dataset, after training on the whole dataset with the three types of algorithms: the AutoCM ANN (Eqs 1?, and see Eq 18), the Linear Correlation algorithm (see Eq 16), and the Prior Probability algorithm (see Eq 17). ACS therefore works with three different weight matrices simultaneously. In detail, we posed two basic questions: (i) which are the prototypical variables connected to the AGA subjects?; (ii) which are the prototypical variables connected to the IUGR subjects?

ACS Weights: Simple Algorithms

The matrix of associations of M variables from a dataset with N patterns can easily be constructed by computing the linear associations between any couple of the M variables:

PLOS ONE | DOI:10.1371/journal.pone.0126020, July 9 / Data Mining of Determinants of IUGR

\[ W^{lin}_{i,j} = \frac{\sum_{k=1}^{N} (x_{i,k} - \bar{x}_i)(x_{j,k} - \bar{x}_j)}{\sqrt{\sum_{k=1}^{N} (x_{i,k} - \bar{x}_i)^2 \, \sum_{k=1}^{N} (x_{j,k} - \bar{x}_j)^2}}, \qquad -1 \le W^{lin}_{i,j} \le 1, \quad i, j \in \{1, 2, \ldots, M\} \tag{6} \]

The association matrix, $W^{lin}_{i,j}$, is a square matrix whose main-diagonal entries are all zero. This matrix has, however, some limitations: it captures only linear relationships among the variables, and it is not sensitive to the frequency and the distribution of the variables across the dataset. To compensate for these limitations, we compute another association matrix, $W^{prior}_{i,j}$, based on the probability distribution of the co-occurrence of any couple of the M variables:

\[ W^{prior}_{i,j} = \frac{1}{N} \sum_{k=1}^{N} \big[ x_{i,k} x_{j,k} + (1 - x_{i,k})(1 - x_{j,k}) - x_{i,k}(1 - x_{j,k}) - (1 - x_{i,k}) x_{j,k} \big], \qquad x \in [0, 1], \quad -1 \le W^{prior}_{i,j} \le 1, \quad i, j \in \{1, 2, \ldots, M\} \tag{7} \]
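Both simple association matrices can be computed in a few lines. The following is a minimal NumPy sketch; the function names are ours, and the concordance-minus-discordance form used for Eq 7 is our reading of the garbled original, not code from the paper:

```python
import numpy as np

def linear_association(X):
    """Eq 6: Pearson correlation between every pair of the M variables.

    X has shape (M, N): M variables observed over N patterns.
    The main diagonal is zeroed, as in the association matrix."""
    W = np.corrcoef(X)          # pairwise Pearson correlations, in [-1, 1]
    np.fill_diagonal(W, 0.0)    # main-diagonal entries are all zero
    return W

def cooccurrence_association(X):
    """Eq 7 (as reconstructed here): concordance minus discordance of
    co-occurrences, averaged over the N patterns. For x in [0, 1] each
    pairwise term equals (2*x_i - 1)(2*x_j - 1), so W stays in [-1, 1]."""
    S = 2.0 * X - 1.0           # map [0, 1] onto [-1, 1]
    W = (S @ S.T) / X.shape[1]  # average concordance over the N patterns
    np.fill_diagonal(W, 0.0)
    return W
```

With binary variables, two columns that always co-occur get a co-occurrence weight of +1, and two that never co-occur get -1, matching the stated bounds.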
If we linearly scale this new matrix, $W^{prior}_{i,j}$, into the same interval as the linear matrix, $W^{lin}_{i,j}$, we obtain two comparable hyper-surfaces in the same metric space.

ACS Weights: Complex Algorithms

ANNs represent an alternative route for computing the matrix of the weights connecting the dataset variables. This choice yields two important results. First, we can define each weight taking into account the global interactions among the variables (i.e., the simultaneous associations among all of them), and not simply the pairwise interactions of the association matrices above. Second, we work with nonlinear specifications of the algorithm, which allow us to handle even extremely complicated relationships among the dataset variables. In particular, we considered the Auto-Contractive Maps (AutoCM) [22]. Once the AutoCM has been trained, we transform the trained weight matrix, $w_{i,j}$, into a new metric as follows:

\[ W_{i,j} = f(w_{i,j}), \qquad f(\cdot) \text{ scales its argument linearly}, \tag{8} \]

where $W_{i,j}$ is the new AutoCM weights matrix.

Activation Competition System Algorithm

ACS is a nonlinear associator whose cost function is based on the minimization of the energy among the units whenever the system is activated by an external input. Details are below:

$M$ = number of variables (units); $Q$ = number of weight matrices; $i, j \in \{1, \ldots, M\}$; $k \in \{1, \ldots, Q\}$;
$W^{k}_{i,j}$ = value of the connection between the $i$-th and the $j$-th units in the $k$-th matrix;
$Ecc_i$ = global excitation of the $i$-th unit coming from the other units;
$Ini_i$ = global inhibition of the $i$-th unit coming from the other units;
$E_i$ = final global excitation of the $i$-th unit;
$I_i$ = final global inhibition of the $i$-th unit.
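The ACS update rule itself is not reproduced in this excerpt; the sketch below only implements the Eq 8 rescaling and one plausible reading of the quantities just defined, under our assumption (not stated here) that $Ecc_i$ collects the positively weighted contributions from the other units across the Q matrices, and $Ini_i$ the negatively weighted ones:

```python
import numpy as np

def rescale_linear(w, lo=0.0, hi=1.0):
    """Eq 8: f(.) linearly rescales the trained AutoCM weights w
    into a target interval (here [0, 1], by assumption)."""
    return lo + (hi - lo) * (w - w.min()) / (w.max() - w.min())

def excitation_inhibition(weight_mats, activations):
    """Sketch of Ecc_i and Ini_i for Q weight matrices W^k of shape (M, M)
    and non-negative unit activations of shape (M,).

    Assumption: positive connections from the other units (j != i)
    contribute to excitation, negative ones to inhibition."""
    M = activations.shape[0]
    Ecc = np.zeros(M)
    Ini = np.zeros(M)
    for W in weight_mats:                             # k = 1, ..., Q
        Wz = W.copy()
        np.fill_diagonal(Wz, 0.0)                     # units do not drive themselves
        Ecc += np.clip(Wz, 0.0, None) @ activations   # positive part of W^k
        Ini += np.clip(-Wz, 0.0, None) @ activations  # negative part of W^k
    return Ecc, Ini
```

This is only a scaffold for the definitions above; the final quantities $E_i$ and $I_i$, and the energy-minimizing competition between them, depend on the ACS equations that follow in the full paper.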