
XGBoost. The Extreme Gradient Boosting for Mining Applications

by Sharma, Nonita (Author)


eBook (PDF)

EUR 18,99

All prices include VAT.


Available immediately as a download


Product Description

Technical Report from the year 2017 in the subject Computer Science - Internet, New Technologies, grade: 8, language: English, abstract:

Tree boosting has empirically proven to be a highly effective and versatile approach to data-driven modelling. The core argument is that tree boosting can adaptively determine the local neighbourhoods of the model and thereby take the bias-variance trade-off into consideration during model fitting. Recently, a tree boosting method known as XGBoost has gained popularity by providing higher accuracy; it introduces further refinements that allow it to handle the bias-variance trade-off even more carefully.

In this research work, we demonstrate the use of an adaptive procedure, Learned Loss (LL), to update the loss function as the boosting proceeds. The accuracy of the proposed algorithm, XGBoost with the Learned Loss boosting function, is evaluated using the train/test method, K-fold cross-validation, and stratified cross-validation, and compared with the state-of-the-art algorithms XGBoost, AdaBoost, AdaBoost-NN, Linear Regression (LR), Neural Network (NN), Decision Tree (DT), Support Vector Machine (SVM), bagging-DT, bagging-NN, and Random Forest. The parameters evaluated are accuracy, Type 1 error, and Type 2 error (in percentages). The study uses a total of ten years of historical data, from January 2007 to August 2017, of two highly voluminous stock market indices, CNX Nifty and S&P BSE Sensex.

Further, this work investigates how XGBoost differs from more traditional ensemble techniques, discusses the regularization techniques these methods offer and their effect on the models, and attempts to answer the question of why XGBoost seems to win so many competitions. To do this, we provide some arguments for why tree boosting, and in particular XGBoost, is such a highly effective and versatile approach to predictive modelling: it can be seen to adaptively determine the local neighbourhoods of the model and thus to take the bias-variance trade-off into consideration during model fitting.
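The description above names two concrete technical ingredients: an adaptive Learned Loss (LL) procedure that updates the loss function as boosting proceeds, and an evaluation protocol based on a train/test split, K-fold cross-validation, and stratified cross-validation. The report itself is not reproduced on this page, so the two sketches below only illustrate where these ideas attach to the XGBoost library; the data, parameter values, and function names are placeholders, not the book's.

The first sketch shows XGBoost's standard custom-objective hook (a callable returning the per-example gradient and Hessian), with a plain logistic loss standing in for the point at which an adaptive procedure such as Learned Loss would recompute the loss between boosting rounds. It assumes Python with the numpy, xgboost, and scikit-learn packages.

import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification

# Synthetic stand-in for the index feature matrix and up/down labels.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
dtrain = xgb.DMatrix(X, label=y)

def logistic_obj(preds, dtrain):
    """Custom objective: return per-example gradient and Hessian.

    A fixed logistic loss is used here as a placeholder; an adaptive
    procedure like the report's Learned Loss would recompute these
    quantities as the boosting rounds proceed.
    """
    labels = dtrain.get_label()
    p = 1.0 / (1.0 + np.exp(-preds))  # sigmoid of the raw margin
    grad = p - labels                 # first derivative of the loss
    hess = p * (1.0 - p)              # second derivative of the loss
    return grad, hess

booster = xgb.train(
    {"max_depth": 3, "eta": 0.1},
    dtrain,
    num_boost_round=100,
    obj=logistic_obj,
)

The second sketch reproduces the stated evaluation protocol with scikit-learn for a few of the listed baselines, reporting hold-out, K-fold, and stratified K-fold accuracy. The actual study uses CNX Nifty and S&P BSE Sensex data from January 2007 to August 2017, which is not available here, so synthetic data is used again.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import KFold, StratifiedKFold, cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)

models = {
    "XGBoost": XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1),
    "AdaBoost": AdaBoostClassifier(n_estimators=100),
    "Bagging-DT": BaggingClassifier(n_estimators=100),  # default base learner is a decision tree
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
}

# Train/test split plus the two cross-validation schemes named in the description.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)
kfold = KFold(n_splits=10, shuffle=True, random_state=42)
strat = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

for name, model in models.items():
    holdout = model.fit(X_tr, y_tr).score(X_te, y_te)
    kfold_acc = cross_val_score(model, X, y, cv=kfold, scoring="accuracy").mean()
    strat_acc = cross_val_score(model, X, y, cv=strat, scoring="accuracy").mean()
    print(f"{name}: holdout={holdout:.3f}  k-fold={kfold_acc:.3f}  stratified={strat_acc:.3f}")

Type 1 and Type 2 error rates, also reported in the study, can be derived in the same loop from sklearn.metrics.confusion_matrix; they are omitted here to keep the sketch short.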

More from the publisher:

GRIN Verlag

More from the author:

Sharma, Nonita

Product Details

Medium: eBook
Format: PDF
Copy protection: none
Pages: 52
Language: English
Published: March 2018
Edition: 1st edition
ISBN-10: 3668660603
ISBN-13: 9783668660601

Manufacturer information

GRIN Verlag
Nymphenburger Straße 86
80636 München
E-Mail: ab@grin.com

Order no.: 28953340


