For the prediction of nonlinear phenomena in three-wave Raman backscattering for laser amplification, machine learning techniques are applied to predict the generation of solitons in a complicated multi-dimensional parameter space. The generation of solitons in the resonant three-wave system is simulated with one-dimensional fluid equations. The solitons are generated in the early phase of the three-wave interaction, and their slow propagation speeds play an important role. Using a pattern-matching method that compares the simulation data with the analytic solution, the generation of solitons is detected automatically. After collecting sufficient data sets by autonomous parameter scanning of the numerical simulation, nonlinear regression and k-nearest-neighbor algorithms are utilized to predict the existence of solitons.
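To illustrate the k-nearest-neighbor step mentioned above, the following is a minimal sketch (not the paper's actual code) of how soliton existence could be predicted from labeled points in a parameter space; the two-dimensional parameters and the training labels here are invented placeholders standing in for the automated simulation scans.

```python
import math

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest labeled parameter points."""
    # Sort labeled points by Euclidean distance to the query parameters.
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# (parameters, soliton_detected) pairs, e.g. collected by parameter scanning;
# the values below are illustrative only.
train = [
    ((0.10, 0.20), False), ((0.15, 0.25), False),
    ((0.40, 0.50), True),  ((0.45, 0.55), True),
    ((0.50, 0.60), True),  ((0.20, 0.30), False),
]

print(knn_predict(train, (0.42, 0.52)))  # query lies among soliton cases: True
```

The same labeled data could equally feed a nonlinear regression model; kNN is shown here only because it requires no fitting step.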