In this study, a multilayer perceptron (MLP) and a radial basis function network (RBFN) were employed to predict the population of microbial pathogens, the chemical changes, and the sensory attributes of beef slices. The chemical composition of Tanacetum parthenium essential oil (TPEO) was determined by gas chromatography/mass spectrometry. Disk diffusion agar, well diffusion agar, pour plate, minimum inhibitory concentration, and minimum bactericidal/fungicidal concentration assays were used to evaluate the antimicrobial effect of TPEO. The phytochemical and total phenolic contents as well as the antioxidant activity of TPEO were also measured. Camphor (44.2%) was the major compound of TPEO. The total phenolic content and antioxidant power of TPEO were 151.2 ± 2.10 mg/ml gallic acid equivalent and 57.25 ± 0.2 mg/ml, respectively. Both MLP and RBFN were capable of fitting the data and making predictions; however, RBFN, owing to its lower mean squared error, performed better than MLP.
Practical applications
In recent years, given the concerns about the risks of consuming chemical and synthetic preservatives, there has been a considerable tendency toward replacing them with natural preservatives. Lallemantia royleana seed mucilage (LRSM) is a native Iranian hydrocolloid that can be utilized in producing edible coatings and in the formulation of food products. Feverfew is a valuable medicinal plant used in traditional medicine for the treatment of inflammation, pain, fever, and infection. In the present study, LRSM and LRSM + 1% TPEO extended the shelf life of beef by up to 3 days, whereas LRSM + 1.5% TPEO and LRSM + 2% TPEO resulted in a significant shelf life extension of 9 days compared with the control. These results suggest that LRSM coating combined with TPEO could be used as an effective natural alternative to improve the quality of beef during refrigerated storage.
1 | INTRODUCTION
Neural networks are suitable tools for function approximation. A multilayer perceptron (MLP) is a feedforward artificial neural network that maps numeric inputs to targets. A two-layer MLP with a sigmoid transfer function in the hidden layer and a linear transfer function in the output layer can fit multidimensional mapping problems, provided that the data are consistent and there is a sufficient number of neurons in the hidden layer. Levenberg-Marquardt (lm), Bayesian regularization (br), and scaled conjugate gradient (scg) are three well-known algorithms used to train an MLP. lm usually needs more memory but less time; with this algorithm, training stops automatically when generalization is no longer improving, as indicated by a rise in the mean squared error (MSE) of the validation samples. br typically takes longer, but it can lead to good generalization for difficult, small, or noisy data sets; training stops based on adaptive weight minimization (regularization). scg requires less memory; training automatically stops when generalization ...
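As an illustrative sketch of the two-layer architecture described above (sigmoid hidden layer, linear output, training halted when the validation MSE stops improving), the following Python snippet uses scikit-learn's MLPRegressor. It is a minimal example under assumed settings (10 hidden neurons, a synthetic data set, the adam solver with early stopping) and is not the authors' implementation; the study's actual network sizes, MATLAB training functions (trainlm, trainbr, trainscg), and data are not reproduced here.

```python
# Minimal sketch (not the authors' implementation): a two-layer MLP regressor
# with a sigmoid hidden layer and a linear output, trained with early stopping
# on a held-out validation split, mirroring the "stop when validation MSE
# rises" behavior described in the text. Hidden size and data are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical predictors (e.g., storage day, TPEO concentration) and a
# hypothetical response (e.g., log CFU/g); real study data are not used here.
X = rng.uniform(low=[0.0, 0.0], high=[12.0, 2.0], size=(200, 2))
y = 3.0 + 0.4 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(scale=0.2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPRegressor(
    hidden_layer_sizes=(10,),   # one hidden layer of 10 neurons (assumed size)
    activation="logistic",      # sigmoid hidden layer; output is linear for regression
    solver="adam",              # scikit-learn offers no Levenberg-Marquardt trainer
    early_stopping=True,        # hold out a validation set, stop when its score stalls
    validation_fraction=0.15,
    max_iter=5000,
    random_state=0,
)
mlp.fit(X_train, y_train)

mse = mean_squared_error(y_test, mlp.predict(X_test))
print(f"Test MSE: {mse:.4f}")
```

Comparing an MLP with an RBFN, as reported in the abstract, would amount to fitting both models on the same training data and comparing their MSE on the same held-out test set.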