Extreme Gradient Boosting for Data Mining Applications

CHF 41.50
In stock
SKU ISVDOUOSCBM
Stock: 1 available
Shipping: free shipping on orders over CHF 50
Delivered between Thu, 16.10.2025 and Fri, 17.10.2025

Details

Prediction models have reached a stage where a single model is no longer sufficient. Hence, to achieve better accuracy and performance, ensembles of multiple models are used. The Gradient Boosting algorithm is a component of almost every ensemble, and winners of Kaggle competitions swear by it. Extreme Gradient Boosting (XGBoost) takes this a step further by optimising the loss function. In this research work, a Squared Logistic Loss function is used with the boosting procedure, which is expected to reduce both bias and variance. The proposed model is applied to stock market data from the past ten years. The Squared Logistic Loss function with XGBoost promises to be an effective approach in terms of accuracy and prediction quality.

About the Author

Nonita Sharma is currently working as an Assistant Professor in the Department of Computer Science & Engineering, Dr. B. R. Ambedkar National Institute of Technology Jalandhar. Her research interests include Wireless Sensor Networks, IoT, Big Data Analytics, and Data Mining.

30-day return policy
Warranty

Further Information

  • General information
    • Language: English
    • Publisher: LAP LAMBERT Academic Publishing
    • Weight: 113 g
    • Author: Nonita Sharma
    • Title: Extreme Gradient Boosting for Data Mining Applications
    • Publication date: 15.03.2018
    • ISBN: 6138236122
    • Format: Paperback
    • EAN: 9786138236122
    • Year: 2018
    • Dimensions: H 220 mm x W 150 mm x D 4 mm
    • Number of pages: 64
    • GTIN: 09786138236122
