News

Gradient Descent finds a minimum of the cost function by using the derivative of the cost function with respect to the parameters. Most Machine Learning models are trained by applying Gradient Descent or one of its variants ...
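
A minimal sketch of the idea: repeatedly step each parameter against its gradient until the cost stops decreasing. The quadratic cost, learning rate, and starting value below are assumptions chosen for illustration, not taken from the article.

```python
def cost(theta):
    # Simple quadratic cost: J(theta) = (theta - 3)^2, minimal at theta = 3
    return (theta - 3.0) ** 2

def gradient(theta):
    # Derivative of the cost with respect to the parameter theta
    return 2.0 * (theta - 3.0)

theta = 0.0          # initial parameter value (assumed)
learning_rate = 0.1  # step size (assumed)

for step in range(100):
    # Move the parameter in the direction opposite to the gradient
    theta -= learning_rate * gradient(theta)

print(theta, cost(theta))  # theta approaches 3, where the cost is minimal
```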
XGBoost is an open source machine learning library that implements optimized distributed gradient boosting algorithms. XGBoost uses parallel processing for fast performance, handles missing values ...
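
A minimal usage sketch of XGBoost through its scikit-learn style API; the tiny synthetic dataset and hyperparameter values are assumptions for illustration only. The `np.nan` entry shows that missing values can be passed directly, and `n_jobs` controls parallel tree construction.

```python
import numpy as np
from xgboost import XGBClassifier

# Tiny synthetic dataset (assumed); np.nan is handled natively by XGBoost.
X = np.array([[1.0, 2.0],
              [2.0, np.nan],
              [3.0, 0.5],
              [4.0, 1.5]])
y = np.array([0, 0, 1, 1])

# Illustrative hyperparameters; n_jobs > 1 enables parallel processing.
model = XGBClassifier(n_estimators=50, n_jobs=2, eval_metric="logloss")
model.fit(X, y)

print(model.predict(X))
```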