Ensemble Learning

An ensemble outperforms its individual models when two conditions hold:

  1. The models are mutually independent, each trained on an independent set of data.
  2. Each model has an accuracy greater than 50%.
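Why these two conditions matter can be seen with a minimal simulation (the numbers here are illustrative assumptions, not from the article): when 25 independent models are each right 60% of the time, an unweighted majority vote is right far more often.

```python
import random

random.seed(0)

def majority_vote_accuracy(n_models: int, p_correct: float, n_trials: int = 10_000) -> float:
    """Estimate the accuracy of an unweighted majority vote over
    n_models independent classifiers, each correct with probability p_correct."""
    wins = 0
    for _ in range(n_trials):
        # Count how many of the independent models answer correctly
        correct = sum(random.random() < p_correct for _ in range(n_models))
        if correct > n_models / 2:
            wins += 1
    return wins / n_trials

single = 0.6                       # assumed: each model is right 60% of the time
ensemble = majority_vote_accuracy(n_models=25, p_correct=single)
print(f"single model: {single:.2f}, 25-model vote: {ensemble:.2f}")
```

If the models were correlated, or individually no better than a coin flip, the vote would not improve on a single model, which is exactly what the two conditions rule out.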
The common ensemble methods are:

  1. Bagging (random sampling with replacement)
  2. Boosting (weighted based on accuracy)
  3. Random Forest (an ensemble of decision trees)
  4. Stacking

1. Bagging

2. Boosting

  1. Bagging bootstraps with a uniform distribution in parallel, so every subset is independent. Boosting bootstraps sequentially, so each model depends on the results of the previous one.
  2. The final decision in bagging is an unweighted majority vote, whereas the final decision in boosting gives each model a specific weight.
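The two differences above can be sketched with the standard library alone. The bagging half draws independent uniform bootstrap samples; the boosting half uses an AdaBoost-style update (the toy data and the set of misclassified points are assumptions for illustration):

```python
import math
import random

random.seed(1)
data = list(range(10))             # toy sample indices

# Bagging: each subset is an independent uniform bootstrap (with replacement)
bags = [[random.choice(data) for _ in data] for _ in range(3)]

# Boosting (AdaBoost-style): sequential reweighting of one shared sample
weights = [1 / len(data)] * len(data)
misclassified = {2, 5}             # assumed: the first weak learner errs here
error = sum(w for i, w in enumerate(weights) if i in misclassified)
alpha = 0.5 * math.log((1 - error) / error)    # this learner's vote weight
weights = [w * math.exp(alpha if i in misclassified else -alpha)
           for i, w in enumerate(weights)]
total = sum(weights)
weights = [w / total for w in weights]         # renormalise to sum to 1

print("vote weight alpha:", round(alpha, 3))
print("sample weights:", [round(w, 3) for w in weights])
```

After the update, the misclassified points carry more weight, so the next model in the sequence focuses on them; the bags, by contrast, were drawn without looking at any model at all.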

3. Random Forest
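Random Forest is bagging applied to decision trees, with each tree also restricted to a random choice of features. A toy stdlib sketch (the dataset, the one-level "stump" standing in for a full tree, and all parameters are assumptions for illustration):

```python
import random
from collections import Counter

random.seed(2)

# Toy data: two features; the label is 1 when their sum exceeds 1
X = [(random.random(), random.random()) for _ in range(300)]
y = [int(x[0] + x[1] > 1) for x in X]
data = list(zip(X, y))

def train_stump(sample):
    """Fit a one-level 'tree' on one randomly chosen feature
    (a stand-in for the feature-subsampled trees a real forest grows)."""
    f = random.randrange(2)                    # random feature selection
    best_t, best_err = 0.5, len(sample)
    for t in (i / 20 for i in range(1, 20)):
        err = sum(int(x[f] > t) != yi for x, yi in sample)
        if err < best_err:
            best_t, best_err = t, err
    return f, best_t

def forest_predict(trees, x):
    """Unweighted majority vote over all trees, as in bagging."""
    votes = Counter(int(x[f] > t) for f, t in trees)
    return votes.most_common(1)[0][0]

# Each tree is trained on its own independent bootstrap sample
trees = [train_stump([random.choice(data) for _ in data]) for _ in range(15)]

acc = sum(forest_predict(trees, x) == yi for x, yi in data) / len(data)
print(f"forest training accuracy: {acc:.2f}")
```

The two sources of randomness, bootstrap samples and random features, keep the trees close to independent, which is the first ensemble condition from the start of the article.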

A.Rofiqi Maulana

Data Scientist Jagoan Hosting. Visit my website at www.arofiqimaulana.com