Commit 15394505 by 20200116038

clustering_analysis-master

parent f03df26e
{
"cells": [],
"metadata": {},
"nbformat": 4,
"nbformat_minor": 2
}
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# 随机森林\n",
"## 1 什么是随机森林?\n",
" 随机森林就是用随机的方式建立一个森林,在森林里有很多决策树组成,并且每一棵决策树之间是没有关联的。当有一个新样本的时候,我们让森林的每一棵决策树分别进行判断,看看这个样本属于哪一类,然后用投票的方式,哪一类被选择的多,作为最终的分类结果。在回归问题中,随机森林输出所有决策树输出的平均值。随机森林既可以用于分类也可以用于回归。"
]
},
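{
"cell_type": "markdown",
"metadata": {},
"source": [
"Below is a minimal illustrative sketch (not part of the original text) of the two aggregation rules described above: majority voting for classification and averaging for regression. The three per-tree outputs are made-up values."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# toy example: combining the outputs of 3 trees (made-up values)\n",
"from collections import Counter\n",
"import numpy as np\n",
"\n",
"tree_votes = ['A', 'A', 'B']             # hypothetical class votes from 3 trees\n",
"tree_values = [21.5, 23.0, 22.0]         # hypothetical regression outputs from 3 trees\n",
"\n",
"print(Counter(tree_votes).most_common(1)[0][0])   # classification: majority vote -> 'A'\n",
"print(np.mean(tree_values))                       # regression: average of the outputs"
]
},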
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2 随机森林有什么特点\n",
"1)对于很多种资料,它可以产生高准确度的分类器;\n",
"\n",
"2)它可以处理大量的输入变数;\n",
"\n",
"3)它可以在决定类别时,评估变数的重要性;\n",
"\n",
"4)在建造森林时,它可以在内部对于一般化后的误差产生不偏差的估计;\n",
"\n",
"5)它包含一个好方法可以估计遗失的资料,并且,如果有很大一部分的资料遗失,仍可以维持准确度;\n",
"\n",
"6)它提供一个实验方法,可以去侦测variable interactions;\n",
"\n",
"7)对于不平衡的分类资料集来说,它可以平衡误差;\n",
"\n",
"8)它计算各例中的亲近度,对于数据挖掘、侦测离群点(outlier)和将资料视觉化非常有用;\n",
"\n",
"9)它可被延伸应用在未标记的资料上,这类资料通常是使用非监督式聚类。也可侦测偏离者和观看资料;\n",
"\n",
"10)学习过程是很快速的。"
]
},
{
"attachments": {
"1.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAPMAAAAwCAYAAADEikwlAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAAr9SURBVHhe7Z3tZyNrFMDvP5hPo4QSasdyYxllo2wsG5dYGqUpK6VRK/3QuFZKox+yVEqFSqmUK0uFlaVSSyjDCuHcc56XaTKZyVsnTTs9P4bNdDKZmec57+eZ/QsYhgkFLMwMExJYmBkmJLAwM0xIYGFmmJDAwswwIYGFmWFCAgszw4QEFmaGCQkszAwTEliYGUbT70JtOw7Jk47aETQtKP5tQf4/W30OlpmEuXuSgEgk4rOloHqvDmRCgX2eBmNwjN+VoK3+Fj5saHyJQexLA/+1QOwaZIwY5K6C/5X5LPNNAWJigA3InC/01plnQOtrTAizsdtQe8JH5zgBxloOGn/UjgVin2fAMND4ddWOgJhLmO3TlNLWbI3DTw9qm9IyZ857al/IuCtDwnhKw9SFykdUjp+q+K/gmEuY69vK7bJKsKjognkuNCBn0HgnoHyndoWKHrrXUYis5qHZV7uegN5VDqKRKMbPakcAzCHMbSi+lcIc3W+qfUxo+VkEkxQ3TXa1K1R0K5DE+4t/e+JsQL8J+VV8rmidg/IHZhfm+yqkaHBxC63bxTg4IdV2Xe0JFxQrRyImFH+qHU9Icx89ggBD1dmF+TIrBzdiQelW7WNCiw6pUqdhTHTaUP2E92fkMJhYAkqWgjKKMwtz+9CUwhxWt4sZQIdUEyzXfRuqBymIrxpibhirccgct8a7j/ctqOwkIErx+Io6Hs/TuG7P53baTShsRMGIGGB+rkKH4t+7OuTFPrqmBBSuXWf+U4MMzeVpXN2+Da3jDMRX6HkYEN+sQBt/o3uZhwTdtxGFxEFztmunxBtdW0BVghmFuQvlDboZ3DZrwE52yPktJ9s4y2Vf58FCgYyly9BUqdn2SQqiOOETx97pUfumCAnxHRQImkT3dcitGWDoRNtvedz0yCRW/BDjXmXt4h9TYL5JQ/lGilf7G7nTMcj/EB8lKh8wOfejatBbqCToekWtGK9/3YLYWg7q6CZ3TpJ4/ujw+SfSgBw93/flQBLJswlzvw5Z+nHc/AaKwSHalRbqMZt52FJnWx6984y8Hh/FbV/lRL+BsYmWbSgTrC26Rzx4S2Ug/Nu7orBsmu53EgbcP4/HJ/I4SaiQMsFrEucxUCkMhoHaCm4N3IsS/ElzuXuaAuN9SVp7QQfK7+n+DMjU8Gxa6eHn3LU6ZCo6ULLoe8G4+bMJs85sjtNA/Q5Ud/JQC7gg/lR0zrKQPX2kour30GW0H7X1hoRjOcgEjc9k76IAkVB6Nj/oyR6B7KXaJcD9G1LRpc+GHVJHccyRaBNJOuUqd44scR6TrPQgtyWw6PwDykImvybFrG0ovUMhvVIfCceoKWX1pwH5dROsndqMdWP9nLJQD2C8ZxLmh3bODNQ8O2XIHbFQO417OM8ddNl28R4W0G63MB6pPLwVh7Ya3vFyc08K+pClc9DfdSmC//LofnvPH604dKLNvi5CZsMC820MDIypUwd16PpM+N5tAxq/6HsqoeVlIZ3E7YMV1MI8JKgjdKF10Rr+bW3UJrnHPwpgGjEo3KjPI2hhDqaGP5MwO80iG2VPDURtatTb6ifKzcM4xjEyIUEPPPomCzXSbKjl0yp5ElmJ4QCq/csCNW1uDSfcC5Fn+zyLz8ycc7Og6DXZdAmS4mW3EPV1I4mPVdOJJdwGBaXxRY3xSMJpWHH0rnNgDfZI/yoJ13xix5RzXaPKwkncDgjgdMI8iv7epFi7R/mED0Vo+raILk2YVZHb9ybIHZmmXkfHjT5w+ywD5m7dFXstDxr82GtuikGBEkrXK9PrhFs+ySqMW6XCHoyZH5qNrCOXPdOKQ7nAwmgYFpR+yT8TUhFMSDDp6/JQFtrtH5y78wmzbm91ud5zsSxhdoJ8H21MLtTbIg7ZZEZildsypHcXvFplVmhi0OSaQ7k0915+AkxbMil4PWieFKCmJ5xOMvkkq7xdcJW5xW1ECLTwq0Rbc58Wdgy7p9MIng4DR+Jlp9FpOJs9Mg+nwbH+ruSeTWW1jq9X6s2SYmYnQeGjjSnmmbpepjKLYvBsfDif0fV5Jhb5AZp8s5YaFC8+AaZjT3X/wm0esB4/VOzrFTPSsSIx5rY22rMbtUKtA6k4BhtTeu6YWiiIcePxsCDELfA6U+520/WcHpvN/tOE4gcMR/6pyHvV1t+1LoF+w1Fet1XIrKNnt2pCYez8WVI2WycovLWxrD9PX67SGgnj4w28kUWY5H4X6ntJiKPrH/s7DdUbmXE0Y3Ec7Gl+UF5jODufJqHHR4ZCYtIPrmV2BNYdT8t6LLmgowv8tbC5hFmXqsY1plAOAxXB+JhZW0yXpRX5D7pWj6y7UkrjYl+nndWQz6L9LS4/Y0zvQH3Wa9rqYxi5gZbWlvdr7Pmf2/FWnrTO3G9BQcS5uP3jEUOpeGi4DDEerS3dJYqgaO5bkLuU55blCqpDtqD4Bl3gwYEYA8VuI/HdqwAn4haFCnEoXMmGjtTpsCR00KWlbqvEUVt6Eb0OVLdNMAwT0n6lPRQeqksn1KIG+wbDKxI0mlc+LrtWEMZGEVrjpoouPRmGqAmLGvY9jjeVwlaSUB6Ivx2m6ACT1tuE7AXe/30dsmsWWBaeU9ede22opE2wDprSKt8UwCLlIBSeqkP78ZQdYGObH4a6gqSGmT4hgAO0iwNP5xnzIFtHKUh9mn4r6RiLHuTHiqPFRUJF/E4Hal8x9nOaCVBJ4UT1W4bW+ILfm1LwQ8d9AwrrNP4GxPe88xmdizwkY2qOrMQguVNxusD8cNofqZqBFqxylJUuu099meJakzrFJoQd2oJG9+vQPLDk3KJr2qtC27cyojyQcb3Z9Cqhnbg4n2gJpVf+OK2jcl/+YlR5CWM1Keey7N5sb2YT5s5xGo/FhyTcLtW5szBkrOYXAvTQHfLjVQvzE6ETbV7hDHWYJQYrHD+rUPVxxXXZaxbvkJBu8zRVmFmQLx+YVLpa/qopT6TATPMgaYDSSrB0AiL5fYHSLBIW8w3W63WzF0C/55HU07H5aFKMYumsa+FC5yjpkwDTZS88z6x93b9KEMc5GOh6ZnHOuCitkYX2dLXVembvppv5CEiYp0uAUYP9UAlKJ1J8EgBzu9mU/Pq3DE3yiMj9GnCjumdZyF/jxKIOo80EWHg93g/zNSfAAuauAklabWSkh6zQQ293bdiN/43zAsOf2FBzC8bNfsKqqyO+cfc4VH5gkks8A6JERtnuPioZSvB6NY2IbjjXwo9HEpAwS5dhbCBPXTzWcHO98yCVFgsKWW+kOqV0dxxlQWWwdRq0NpS+4gSiB+q7+otCh6Ddr9eJ0wasxwGtdPsiJ1ZbGRaOx5Ak67KY1+Yd29pnafn3rTlfoCCUQYDvACPltRpFBZSUMfYI6h1gbiX2SAITZlH492gaaeyjVtUtnEYUkif6CBtq2ybEx
PpQSiTgce9LEEirxF0VMh9SkEmjW3bZgurnOCQ2M5D8WICGsgw9lGCqXfomH3TTiPrIPIJ+B2o7SXyeKllGya93Kch/bz2u40/Xu13bPC4zJdqe9u2cwbcLByfMotb2giwZdfKs0ANFl9tDnikx8+TvhWKWiFqzPNgTvgie3XuzfaB1n+MWWjwrqPd4qwbdWm60S+eFLbRgAoLKUFsmJF7D/2gxmRe0BPKmCNZWHvKHbk38ApdAMgwSsDAj/HIChlkKwQszwzBLgYWZYUICCzPDhAQWZoYJCSzMDBMSWJgZJiSwMDNMSGBhZpiQwMLMMKEA4H9khDugzrIaIwAAAABJRU5ErkJggg=="
},
"2.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAiUAAAAzCAYAAABIfXhEAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAABSBSURBVHhe7Z3/hxvbG8c//2B+GiUsoRrLjRKlURqlcYlL43JTKkvXdaU/dF2VS9f1kVLpxwqVUlkqS4VKqVQJS6iwPJ/nOV+SmcyZb5lJdjL7vDjszmaTnDnnOef9POc5Z/4DDMMwDMMwKYBFCcMwDMMwqYBFCcMwDMMwqYBFCcMwDMMwqYBFCcMwDMMwqYBFCcMwDMMwqYBFSYaZf2xBqZCH/KNTmF6O4fRJGYqHRSgUKtD+NFevYhhmF7A9MkwwLEp2yGw8gMH7cKX3ug2to1ao0u5P1SfYGcHx/TaMrwbQzFlg/dKE/g/5l+GzHOQe92Bfh8HZWRNKNLCr35NmflaHUr0H0yt1Yctsuz7jlyUo/zna2/beDjMYG+zOXHpw+pfZ9tylDf3v6iMcsD1uCvffYLLUBqFFyeJ9E6wcGo+pWC0Y0osue1Az/V2UPBxfiLe6sUxelVb347BmGND8ShNq94pQPLBs91SVg2MYrU+gn9uiE8HFMeTp3n9S15HBH/g/T/qwUL/vE+RtFm5jf9uydUzQCK0HaOQRhMns34q7bZalBr1L9UIbu6nPHPpPLCg8G/LAvmQCnbur9ik+NtmcT3lag/JhEfKWvY1lyT9Hu1uH7TEGEfvv+ASKjjZpQH8fb25IsjaGRI+UoHEVVGPX33p8PeENyNfkj4RcYYirKZw+0KKiAK2PGzbvAr28d22o3dHv5Rzk7Ez/KUvRon4XHttBDqpvZur3PeLbKVQsvG/nOxhhriZwgpNWgSaSqCxtxILGmU8b77I+8z40LAtqb/ew3beFuP9ynMrFGNQXszH0XtSgqN/L5CQo2B43ZIP+O39bk+2BzkVme30Gx5DIomTZ0LkKnKrwowtbxKRxlmGJugkJDYSCqzmMX1VFBMv63eRpzaH3GP+G6nbJF/IiqtDdOyuVgs5CjzLOLYvEJ/JqC5EjfCsbMUdHJLuvz+xNFXIWfqfMjtDRmb6uLCPAsb3AyzF0HpGjgGLU6JqzPcYhav+dvCyKdi2+nKgrWSObY0hkUSJCjWTEhyfg2dTnLWXoPsLlBjM/aywHQutxL7aKFwOr1YD+T3VB8xOVLX6GXRiO/sxD7lEXP3MCnTKKInU97Sw+tlxh7+2DRn8f20ncr/AsbaTcwXcwcy31+TmEFnrlxuWFG4sMS4v2QjER2wtU0VCjk8D2GI9I/XeG7SDbtPVRXcoYWR1DIoqSCZwcqgH3j4G65kaEKOk1fsLlRjOH4bPCciCsvI6bniQH1tr6cppYv7YLQznJVv6diQ4t1rj3ghl0H+G9utvZeX8SngEZfuhoycpG8p739/rqM3qOk6BvBOcGMseB9rZss5yF9vJNXd8UEeY23GO2x9iE7r9KAGa3r2d3DIkmSmzLMmRIZhY4QcrX+AmXGw8pTj0Q4kDViTsQjrvQ+u/Y4Z1R4qWF3rq9045flSF/UITyk93tLonN91Oo4H26ljCs6vPeAmONMEuX11kfMTHm3AL2hrM4by1z5XL3O7FtY/ymBd3PzvZne0yAsP1Xvc4vWrnXZHgMiSZKPjTVJOrnOcrELXodD3wBfOuIjiXuKeWXrC+/MAKZo2FB61xd2ClT6JSxfcJG/ZY2UvYUmtdaH52Evqe7PbbJ9J/V7qnC0ZDvjwf70H91tD60M7FnZHkMiSRKdOKQ2GLlNYEqBZfLFeHki7rGeJJoot0eMH3bgNItrO+tsjwwipJ1X6traGSlJ12YODxGHXkLGS78PoDjB3mZ/HtQgfY5fsacDqoqyWu/NKAXMSolzpEImR+1tBHHDgs7110flSejt/EzK5LaHbc3zGH0oiK2NVt36rIf4T0YPJfXclYeKi9wTIphj/PPp9D4Rd5T6qvdr3jxh+7TFuQftGEU6TaH6b+raL3vRgsceya0i/GutC+qb+nJKYz9vg/dnxc1OV7R/Xk+gNliCqP3Y5htFOnKahtsTgRRohOHsPgl8J015GvoC+9LOPJamUHvsR4I89D8kOGBkAQrJeTOVfTBqkDtUR7KRwMZur5EBX6AhuNI/lU5Gp6TvA3a2XSrAicX8r/prJGcVYZy2YLKPxO0TrVkFvGgKhKO1D7BCXM2G/H0Iq6/PkuRZTzk64Yz60FN7447aMIgw+YoEiXvnqAToDzfu1Wo3SlC/fVYToJfZSTXuS0+fP+V52eQaCZLUAnFt7H/3kbBR+McjgdVfP+o0Yzg/ot2IdrQx5G4HMIx2lHudh1OlX3B11Oo0fjjdT7Rjz400N6s8jEM6V+uJtChpGaLxu/NohbZbYPNCS9K9E0LW/bhhMIfOqoTo1h1nGTV+23KXE7G8v2yO1nQTgOZta2UNtZ3PTokO3sJOqTmCd3v7gecVng1guPba6IODVLcU5UMNjxS4i/quQVqSSYwIdlmI56vTUF9dGg70q6EtNjKDphje4t8BCxiglLXs4XcnizPR8FJXNR3Pelee8S2yHjY/ivEnTNXTot7uTNpJeCtiGdZBfZfsc0a39tryVWLeazXeqRBRzpdqQe2/3H0YXFsANVjk8TPDLdBDMKLEt3QvopwlU/inYBD4aomdL7EXY2aQu9pM2Lo2s1iPof5ZYwyj1sPiVC04v6qgTBzUSbqGyoXaTl5u5cB9Xba5aCglwMD1i8X/QZYa0a6vq48/W8Nioe16H1PiYHyP75DgM1GfHKuUlAfPTA1P6gLIUmLrWyfpHfHpRCRkK0m0m8dKFNdXZO4imja+3PI/kunVzvOY0Gctr2A4Z9lKN5rQj+ShxDcf/U5QeufL9GRafP2b/3ezk0a+F2PZH8ovhirawpt85sk1Ga4DeIQWpSsjtB2TyRL9M0i4eKhoKavq1D9NyEjpx0s91pbPl53d4z+1ANhDkrXkVW9VehZI2rdVWfGuzx8bYC2SIPuU8YBZsXi2xCGX+0dQa+7evdFCQ5SdQusuk9kT0coAr5DJBvZVn3QO6rfsqD+ztso9ICS+sn2p0HYhC1xBZCIVNH9plLKXn7czykMzyeizy8n8XVv2Rb5W/a5kP2XnvM1diyd6K3yAZHgBPqvnniN+SQ6smGPPNjQwt8RhZh1xRKHKU9S37vl8sfXHrR+LUOJHkFwKw+Vp13vHJUMt0EcQouSUAdC6XwSr1AWqsqST8hp8m8VPb+CTDq6VRBP0JQFr90qQe3v9YQf/EyKMKBi9L59e4Qj0W4fT3kMh+7Qrmja8myB6Abo4goFq1hXDgqrTqGLg0jjnY9xhRQlSxvxW07Zdn2+d6F2rwE9n0FnP0TJHPp/aPvfoNw7gTWfNjr205cjHqC3TwyfyTHHNYkvI3+2SWzT/qv/LygPInb/1dF688Sr62qOMqwSZO31059n+u4OAYR1rNJD8fQcdTmQSz638f8CIt/ZaoN4hBQlq2UZv7UncTohVdq4locN/js9C8SgXu2oMJars
qqB3Vv1SP1FPwo8tYjnpmQ0ZCzQhm9Y4sDJXwhS+8S7qQFqYw5adw1DKFGyshHfpLEU1Gc/REk6GP9VQK8aJ4SYy8TpRXvPbrFrnIw37L/aYTUvqUTDt//q3CfjxKvriv9rOmdrKfztOSU2oeI6d0vbvLx3MspiQfP9aobS39WVo+IgY20Qk3CixJbk5i0qVqF3o3ARITCfsLZChrHM24nl6ZpuBUzJSRsde/sDv5P2hDYtSSbvqUhJtrcGa8N39wUtah27b7QBBqyfrqONZl0gzD4PYBzV5VWixDenJJSNICmoj36vSOvBabOVXSAiJRnfGqz7rcuRlEmY1G6OnR8b9l8dCXBOznOYnA9hGjAnrOPXf5fReiUgpv02nF7ob4pjD/3NK+fLuLSzSsp3TcD6Xqh7N8O5i7bYOpY91NjhO3lnrA3iEkqUrJZlzCExge0kS5MqFGIjxC4BEQ7z2k6sGtg16NP1oJCUB7GT9yI2pjcyuW4rSa4XbSji4Nr+rH6Pw9UCpu9bUKbvqS5FwsvjpzV84XWsRb30kk5AhGD6rgHlwzIci6dlrrwbh9GIz9A7eyjhugrFO3nI1wOePxRi943DRvzOM9lWfeYjaD8sQvEgj4Oiv5VtmjmfHlvZAWK3xZYilkna42IKg6Pyxt9z2W/XvWedR7H+8LVQ/XcBo5e0FF+DrpgvdCRg7UBB+gwtABLqv9qxkXMQfa49B0uPMaZ5TEbyKdJRceQ8roTB+ufN39XFdbujsPjpnJukI+0fKclaG8QllChZLsv4TfwoDGTo3RzlENsXA8NGqtN4KUCvyUEoR3foa58g5WnFfWqwB4vzYyg/PIFRnElh1ofWowqUHtagQm0UMKl6sUwGXfMKpnQEN1537scnVJ/wFZ16sMlB6RW+Kxmt8Oqdg4/wZFQUhna3VCjyIUSSf/6O9gr8DDCUjQi2UR8aUOXWP7GlMSD/gc8pCWDLEcsk7HHWb0H1fgmqjyvCu99UlCy3lTvGXL3bxLRDJUT/XTqo6mnJXztQEr87D9uixH5p78n1X/k3NQeRbTuiHivh4RIYOH/R7kfLnhOi0NuEHf+jtwjjdb/VAxFlCcgpyVobxCVYlFyNoX2XvgAWn3DR+C/ZcLlcEwauBsAKo6cXuKXSK59EYewchMhQ9j7WO+3oA276IU4MvX6UoW0oSmRiGB04lIfmeznkU1SA8miKRwNXIrPuO/4JntILyD/qiEz3yasKFO6V8T1X53zMPuJEQCcmCiNC76cu34+2zQU91GppgF7tY7eRX4PO59lCfeY9qItzgSbQwe8hhIwnqv34RFcPZMSy8Ht/wxM6d4wK5W8mSvSSO9kjTkgUcaMTll+Sg2BB9bWpH4Xov8qTL/4xwHs4R5svQBn7r4U2JMboqwVM3tShWG5LYZZg/xX2TBPx25E42Mzl5ND9QoFvPUCbp8mMIr/vmihecPz5zeP5Q+K8D7RHrI8Yn74PoEUHr5G9+9wH6WgGjevZa4O4eIqS5dYoU9HLMCIp0/B3UeyDuKxEkOH45ZPQYCHDaKaGkGuF2wglbR11amfsRDrsZLs5ByKOKNERgBp0P6ER3JFeS/5uAzofvd9NRirQu/E5MXH+uQO1A/l+JfVws+WR9uLaKYzX+41aMpKHF3mhBg1D8vbSwzEVH4PdWn3EmnjAri29xTDievRNQUwkSSyh0nbmXdzgOKJEe9PocQ8+taGs+lbh4TH0vnhL6zD9d3bWXB3F/mKEo/fqOHV9PLvrHifRf68m0P2tKCb0/GOPdqRHNzxUuzxFfZvQ1ae6euA8qr0G7TdtubTisRuVHM0infzqJRo0WWyDmIRLdI1NGFGi1J9XPolaX5Mn0a2zp6JEhACTSaQjD8G1xfZbDxr3ilA4KEI7sd1JMUSJTtCKetCQCj0m/kRMYYB+EQtEDRq+O2qisqX6iEeKC2/HB35KsCcyYonjT+xbQ96mwbnahj3GESV6yd21qyQA7r/LPBDjuICOZg0F0fIZXpdD6NEzq0xwG7jYkSiRUQ7/5RvlRXuoL/ncj4p5DWsfl2/UuvVmYdc15pRzsD4I0nMZ6NkdUuxZm+xOMrK5KNGGHPVIYznI42cGLLNEA+/L75YUuRQxoXCu+osdmahmO/Y+EbZQH5F3otaPUWxVPMKvYtDxODjqRpNUxBKZYz+3XJG1LdljDFGil8ON22N9uUH992oBC8OEJHeyuHNTyNFsP11bBvrQ9JzAuQ3c7EiUqFC3X6Kr2pVhNC6xNc/yPglWGGaAx5sq1Lp1Eol0dH4LLYOsD4Kf21AmFW/vaDCGzuMaqviwpWM4gGpzUWLekhYOcew6beX7pC7EZii2JtNuFppEzIP6DLqPcAJBzyHqkBFE4vU5J4+L8rnooVte4l0Kf3cy8Q0nwYjl/EMLitivXB6syR4pedxodx7lqO/uhxuLEmXHXttjA7gR/ffnEI7FEvPaTikxH5EgwDHXLj7mYzhBRzN/Rx3gp0oe29u8zMJtYGJnoiRoS7DcleEOec4uOlA9KELzzGda+NgyrvmnlUTWrSlB6/2xWoP0DuMJT5+ytON8loMNRQklg4otaZtGHeTnWk+SOr13AcPnOGAUilBWORsuxPLOto4YT7g+YgDNQ+GwDI235pYRfYE8nGRuYDZIKmJJW3OfUyIh9XGvvLgt2OOmooQSG8V3bRg2JoThBvRfMUnjPTpowZCiAjjmzi5OoS5Oaa27nru23IHnKh4TPLeBkZ2JEmk87pDP/Kwp1GRBJfg4VCb+XKVnBwREQCgEluia/xYR69b5OnS/qbMbopSvIxi878LJ05pKeNTFaxCUnn6y90YaQnhRstrn7yzYFwxhUV+Eh4IerTi7Y8tcTeCEstATXrN1sMv6iCU+9NDPWJGskBHL/G9dmJrsLaBMLgYweHMCzV9LSoyo4ukgbcEeI4sStUxu/75UNlkGyHz/ncPo7xqUCzLCS8UqVKHxtyFBNBLcBn7sTpSgQdLTGQOPmY8MbZ9Mes1/S4gOZOiMcYvXICiSoeS9IYXb6F9c6/JNEiSXjOgP5TDt4mnNu6kPhWO3d+7GviIiliZ7ilk8kw/X7fHN/65x+SYZuP9eP1lrgx2KEuTiGAoJT2b79EA+emrj4H240nvXM143Fo9zxsWSGO10Ia//gQpBxuZ6RQkxO2tAkQSD+j1p5md1KNU9lnS2wLbrM0aBRfkMPKDboadWG2zJWHpoj6br5uJ17P9W7PGaRQnB/ff6yVIb7FaUINPXVe+E1aj8HELr3rYV4h7zvQvVgzwUD6vQ/hTzJv3oQl0sq9n294vf296nCjIMsyJBe5zRwVdkf3ppQT9V/S+2Rma/2bkooTDQ6EUTOl/iLuNMofe06Uo2YhiGYRhmP7kGUcIwDMMwDOOGRQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzD
MKmARQnDMAzDMCkA4P9k4UBrilysOQAAAABJRU5ErkJggg=="
},
"3.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAS8AAAAvCAYAAACyqUxsAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAA1QSURBVHhe7Z3/ZyPbG8c//2B/GktYwtpRPrHEZaPcKBuXuDSW5rJSbq2rnx9a18rSWFcvlbJCZaksK0uFlVJZy1CGFcLzec63ySSZOXMmM9Nmdp8Xw3Z2mi8z57yfr+f0P0AQBJFDSLwIgsglJF4EQeQSEi+CIHIJiRdBELmExIsgiFxC4kUQRC4h8SIIIpeQeP0AOKM+9C/Nju7pEbQOWkbHUW8i32HzcN43obTbgSSfcHRcgvLrIbjyZyJfRIrX9LIJ1tYWbAUdVgsG7KK7LtSC/p8fBTj8zF+KyIjxm9L8fm/XAoUo/GhC7Rcb7MeW75nJ4/EhDGfyTTYI96oFxSc49hKrjgu9PQuKrwaxBGzytrx6r9Sx0wGHXfT5EApB/8+PGnTv+EsRCTD3vK6PoChvfv085FHP+tCU1xQOuKwR98FsAp0dJT5FaF2tOaunDowujqD2VL0WGp5P8v82hdsOVCz8jh+n8kRC3B40LAtq51xyYuFe1OV9suHoWp5c5rYNZTknqu/ivwcRjrF4uec1+aAq0PkmTy7j88Aa71MaXIQZfFKLe7+V1CuZuTB6U+Uet/WyB5vzJIVIW3u9VEM9598qRhHoDcXUluHrgrjfKgIJwvPAbDj5Is8RqWAsXv19OTG2T2Asz63wsSVDTI3AEZnhvm94Ib71oivClwRMTitgWQ3ofZcnHpgphouFLLzB7wNoPcZo4c+hPGHCBNplOSdQTMME3nlXEdfoBI5YC0PxGsPJtnxQ+315bhUvF6ATOCJDXBi8KopnsGVB5TRpwl3khGphaYJ7xYGzXfxez9qZjK3hn8yLipGL+o7hJr/PW1B+G36fPaOvEThiPczEyxcOVkLj9ikOdPmgNAJHZAzzIp7I54AecPtWnl+X0Rm0/hk9/MT7imExfif7OCOzKMM7Y6H2wkELWh/luRXmRl8ncMR6mInXh6acDLrK4RAO0fVm122Gpf6JuW3zic6fGct/bUjYlwSRc9UJRUJUscnQQ5pXHDXemmf0qeKeBUbiNT625YPS5D+kZaTE5GbA81X8eWzFbgXImsl5A0qP8LM9KsPRJ/xkrEBwKs+hQJX2zmC80KKhvHrDsO5rHw53CqLg8LgCRx/xPdwRdPZK4tx/G9Bd8Ugn0HmO72GUm/JFGboUyZXKAW9O3vBHwkC8HOjsyAdVboc2BU7fN8Q17OFvYG/Qz4cD3Rfzlofmhw2RL2bkWBHAlQlvqwK13QKUD/owYePmDj0g9OAXCw4y/GJ9Z/JMKKzq+qgCJ5/Fb4+PS/geZSiXLai8RZlxZViNr798Rwav2L2qQOerPBEKvoaq7GpSJJ7Rf56smZYIJlq8fL1bRkfAoPjp+Ka80ASHVccJLl9vXVwhBOL1TCZl9rD2AlHVk54OfrZlz1CISAnaN/KEGoNRIjAbwuGTJaFG74d/f5noHxxIQVfNpD5UKNi6kifC8PVumRyF13GqmIQp0eL15QRs/hB0+YZ5vis8oerC8H9NaH/JIPX7bQyTlNzyyUUTmufJ7eTUdcG9S3C46dwn3o0uJ5GFE5Z7Nw8GGycy/+MZxdWQSlXovNypSklE5KOmvQZYSwKnBEkJyOSfGtjbtcBxyEJtdm3zgzwRghdlJO15nE2g+8ch9JL2tATg3EwevsgSxfchnOwdwXBNIx0pXl6filG+CwUuxGpNTqtQfZeN8zw5baXoVUzROpfX71LfQIavVfvEFpSyqtYZ4cDocgQOE1BVrVvxgOb9U16rhxpf6KHpmN4OYHDjf24qNxU+Lv0o8YpqMfG8N6N8V5jHy9pacJyltVJgAfRqX+UkVMVnW2VrVNcwqpHi5fWpmOS7whKq6L2VQl3+MXR2bbCLckA8KoL9ywmM5P8yhn8VxUCwCmg1Gyud0DrxGh6XwH4qkrdsEBeeNqHHPqPThbpaz8fec1ueZ/B2A5aXkT/nnYXlQ1U4y8DSx0UJxYqn7uuf8gTHULxWmKnclFmi30y8fM2pJvmukDwdayhm4XKgdE370Nqerze1HuO/9/2rChw4+02OaTlfFteg6sQLvZ1nvrWsbE7J13bO61CQuTyruPye2cHu+zpGNUK85uGgpVmr6C2TCLREaP1eWtHLhXShAVt/touCFnInoz2vMbSfse+x6D26Fw2wD/rgBqg+G3zFHylXwdemptG4mgbKIwpoIfA8Fp/grCteKuVhmDA3Ei+jnkcXui/ENcGhLhuPBlV5eS8CUzH43aove8KLXcHA87ppQ2nl8+Hn3rOhdd/FHWaw1liepRcvX+I5XHzmlihQ4JwzqBqVioMFhnsNv7e0a/VMwkY1ML3vcduBOn7e0JdlA3/dXRW+4XdW1ah1jzQS9grpeW1Oy4TyiFbHhTKEC9VGnWHToJ75csLcue7DKGCiqOu1OS9vCZxGfHxFrkCB+4Qhs8kqFD6p8XWWr2UV0991oZaBeLFreMFk/gwmp/UHSpcIY1b9N556acVrITEZJg4+SxTUnMqbCwMqO0EIV9ufn2B5gXpkl7hRzss/ASIfPgOvSdBcmDhhHyn2poglQw+frPcR5hGxaiH39IuL912FkhEe1AQ96fJ2GQ55Hmnei7UgRvw9fJVMHybVRi8c1LUEed33weOHCbQukpmjvoNv/jFDVG9BX6sxJuLF5s3coLPCTv0BvXL+WWJ2KmjFywsHdf01npsfbIl4ctPU3ZeD2pLX8yS/wQ01S9grS1MEe0fvyQnE9XlfLcAGhZXK3lfp4RWBljyKCZ5nY2k1XJeipu3zmqc4Sm/wVZXXsmR4HTSmYYvWo/u8fOGgZqIpUQgucom+SdPwXTkQ4nqZ5I/0jszEyzPoT2yo6KKQJLDm4P0KlLaLUGS5uesO1LZtKBar0PE7JagjRn18PsLFazaCIx7G4aFx10d/qe77JvRXLJGwHObrunyDFK1X2TDMMRMvHDZs6xP8rPULs8fEihV5XpMmNu3DCbRhO3yIIpAFllWA5qV4FsxrYjm54Byk8kB0iXfRyFrYbfPc6PhNBSdLGV9z3vflXOGYelqHbuBYkcZN12GPUYYQRE1L0AwFri6uCS5yic8Z1Y7hoSIbFMvRed3ImBuLF0v8s8XuW3hPzKZETBy8F1XPy+XPnc1tpwf1R0sCznvngjQknBXxmq/ZCjhU+OfbmHD1QMvlTRYxIOIkiQevWBUEB3aM8qmZeKHVOrCFl2jonnJLHDdJvCnITvMF67YOMwwpUuo5EygPqQZnn7pQ5xsfWlB41oD2Vfg4Ed6Mfm2je92GGq+isSVGXT5+vKVI/FwHRmHiF7q2cYqTTlbmAg4V/s03Jgw4FgSRpSMMGmE9pKji57eNvSND8WLpE7nxZHiEMYL2ixrUjA80HvI3WSRV8bYZErlxnn/8PoTOQWexv4t7gTqvdxV9wj4x8cVLiGc51m4IZgl7l
ox0pAU3axfIrXjxJTAJdlT1wbaYXvEyTEOBIFSoomm9CURWx8KboBMSd1eJtYkrXspTjeOVGIiXKoR9k/nE3bPAUDo1uAep6bfbPPESOQLz0EuGBzFj3yjx8icjVQ7BpLKRy7BRVhbjGIxQ+BbJy7nMGKFAAOr+myWs/chqdJb7ed3LxovC8zQOG2WYGS+ZHSVeLAqpS0OjQvJs+//Ec9fc3zTCxrSJlbCXTYUqYW+KTrzc65PFlgiVyI3s/RFeY74S9qKymEpLxF1fhBTLZfo4oUAAIi2w3n3ly39YBS/tnVRllfN++vriJeyVpxrPGOnEy4XR8WJLhLivYe+RIGxED/3s7x4P33nxz9d1MDxuwFlmCfuUiNMqsa7rHipeGGpUyjj5FtQcLc1LNoGCy+VzmHufr+19eGUxaUvEbAqTy0Mo8zxRRJgWFQosw4pAfHO+qHsfhjAo2exhj15Bmi+qwbxVQs6f2C074eI1fluB8vIzVQY9Za+WOy5WHbqOzHMqp+S2A9X6YsWX5zRj9vFlLl7CckS44zf4ZfzLIdjShD3zPdiXxWvwGn9fLQmyClB9px6JC719G4pyYvJlF899lsKPalKVP246vLJYqKM1C+gZizpuhtC/PIOTP2oyua0OvXhHhgIevhaDhQN/N24tgP+hkfT/elDj/T0pFwOflRXRpDq9bIHN8oiy2FB4akPp2HQ0LokXRjSHC8uNqtBRxuOuB81tufxOvk/lTeCMiM308wlUf21AYxdD1OsBHO2UoLZXh+p+8H5tcZ2W7MULJYjtKxW5PCgBJgn7uLBmRN4vlAf4hPaLQkpH0ASLEwpkhGgBSaN3bb2/25gYvnVPll69LmzcQLjnZ2IEF7kH8UIwHCwari9bh9TFK2cLs2P9xeyLbuD5wCNgDU2cUCBLHPT6bBYiy5/X4SH/YjZrlg1dmJ2YfIkXCxkzWJidHrQlzo+BeShA6JHd8rQlTnZb4qRHdpsRpileaW1GSBCRzLLajDAn4pX1ZoR5YHo7Sm0nVYLIP1OYjHKwk2pCfgjxIgji54PEiyCIXELiRRBELiHxIggil5B4EQSRS0i8CILIJSReBEHkEhIvgiByCYkXQRA5BOD/9g455k9Xo4AAAAAASUVORK5CYII="
},
"4.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAT4AAABBCAYAAABIO8iQAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAAtFSURBVHhe7Z3vZyPbH8e//2AejSWUUI0+qEtcNvoglq1LLBvL5kGlbK2v3get68rSWF9dKssqV5aV5UpZZaVUVgnLUCF8vuecOTOZbGbOfGYyaSed94uxkk5+nR/v8/l1zv6HAAAgZ0D4AAC5A8IHAMgdED4AQO6A8AEAcgeEDwCQOyB8AIDcAeEDAOQOCB8AIHdA+AAAuQPCBwDIHRA+AEDugPABAHIHhA8AkDsgfACA3AHhAwDkDggfACB3QPgAALnj/oVvOqHhZY+6px06+6dHvcsRTab6b5LbPvWvJ/oBAACkD0v4xu+rVCgUQq496v7UN5qYjqj3pkpFq0BWqUaNgxa15PWyQsWtOnVvxD03XapvFunw0nkJuE9G1K4E9a9zVd+P1V2Dt8XAv6vreZdsdRfwmPSoKcZ8YHsVLGp9lTfZ1H0e9HfnKr4dqLcC6RHP4vt2RCXVGRY1PvGHuP31iCpPxOue1Kh96UwgP/aXFpU2K1TZlO/doIs7/QfwAIhJWNeTbvuIrvzWuI/Ru4qemDXq3OongYErOlLjW1x1sUCEtGvvtb5no0X9kHvA8sQSPvt8Tw92ppUnGJ3XlVhalUMahGrlmM6e6Q6vtIXtAR6OAR1uOH1hHfT1c4t4lt/2CQ31c8DAzy7tqbkzs54X8Vl+Ly8IAZ/VEUv4vNWIKU7KkpP3b4rVK8JAdC0I02QD98BNmyqyz8TV+BQ29cbU2XXuQX8x+dx05k6hTCff9XML9Kml3eJwcQRpEEP4hnSy7XQKK+Zw06Gq6sQSL2anB0b4ZAP3weRTQ/VDoVChtoy7BjHtUVPdg/7iMjwu63ZtUi/MhfUWHZM4gjTgC5/PVI8c7NMRtZ8691rCZGdFA4V1WChUqfNDPwYPgufCWsJK188t8P2EymosGMQR+PC5sLsdYS8HM1t0DOIIUoEvfJ6pHj3YbdGBlrq3SIf/6icjUJ2+cUjIXz0kvsyuIUPrxXrRXzx8FrLJW/IWHWTHVw5b+DxTPXKwD6n9m9PJcSbG+EOdym8RL3pQ7i6ooSdo+Tg8ZdHft5z+RQCeh2chm7ylEXW0l2Rqe5AOTOGbBbMjB/t1m3Z0J1v7ELK14mtLW+oFan7Wzy0wi/VWT5F/5zCrgzV4S96iY1Hri34OrAye8PlM9ajB7i92RuA7HuZCcd5lvWDGVAOY1eZxLhSac/GqIVgXv1QMJIcnfJ6pHj3Y+/tuB6YV+LZp8GeT2t9XIKK3QxqlVCw9+tik5vmSFtB0QvZPe7krcTNN6OKl7jtDAH5Wy2koNL/pUvPNReh7JOJuQCcvjwy1oFllZiEvWxdp/3tEzXfD9MML9pCGaXXWmvQTS/hmlkjUrgrf5EkpMzU6rVHt/WpcqtFpK8Us8oT6BxXhpqxrWHpWQ8aK7z3tBNdy2uJ9fm9RfxW7b350qPZMfG7ScbXkwjK3p5zLbYeqaj7w4nuh4aEb/dv1w1T50krXvV62n+4BlvB5prrBEnDwpe1DExu+rTsL1y9WorA0d8ImmFgXO8/KVC7pifikROXfT8S7zxj8t+TErKwilbcb1P3ly5uEb3C8Q+Wtoo55WVTcatKFdEHGXapv+D5zWz8vuROTflMsDuuofb7CZU58Lzg7aYuFryQmUfAEn3xuifYqq/3aTpuWqenf+nh7Rnu6ba1SmSrHi58xOq3STsLgv/2pqT4/2VWhk2/6jeLAqYbwJZX2zgMGz1S0+2/hpV5Lt6tJ+MR4b4j3Lsktp/I7ijFf+cuZZYPjsm9+iPfVz0uW6af7gCF8sy1MnMJlz9UNFSwfvr2/i5NNWI+v5J7gCMNerC5qRQ1KuthiQD0TYhgiRNEWn5uhnrd07Y8NKh/0Avdbyux3aQ03lc9cWDHBwvbeRtVyysQWYwubYzUGF+kOj2vivQ3LqxQJa29hEcsqrLrIy0MqqnYNbpPJRUPVw0bMhOTtyrD41HcQ33HOG5gKI6ZSo861fuwn4/0ULXwsU32GVHpnAhk6WmOMF43PqMY6sCBYnGQRdeeFeascx9V1f4/324XLUT/ohycQZDxUWrsJzPzx/2peVjXplTS50T/QVqxJuDzrJVgcB2+EZc1Y5QMnkUBucaxHZoqdcErtwzooH68awksqBYqjs4+dM/cStyvH1Z3qUIg3PmwxZurUCY3jZ7ufIoVvVk1usAT8eOUs0dtuTC60EsVI19rBqTH0lwGITtmvRyZXWDE+v0Up41cvomIX4p6kGc8UkhuJ4lBiKHsBeEMJkme9BIYxpGfgHrMUgeva+b0CuaDsCytaPzShFqN1KPL1ubDh1RC+8FDQb1JWNnPuJW1XjvAJ/Bbl6LQuXmPugSz3U6TwmQd7EBPRQM5riq8NDS7dUEMwXVkg3DpAnXV2J61KiDBqzHjJDTfwXKLyrtmCdHDuD4zVZBWfCxv+vWcB+EDrRXkG4XGoedwkmL7fTYhwm0xM1DjF8Q+G58IaFkLXkhL3BVrLqrYy2ntySNiuTOGT90mLsri1Q3uM+ZXlfjILn/Th3V0Yf8RQbtngW3J1sKh6fLX4OvuKTnYtsp449yxaCU4HVt4xGleh45CykcVgqwgB5HxXnvAJZ+NDTbVB/SOvBaQly//uD4+MWToutsFKF+JYV/eE/Da1+PAz+W6YY+/8irr1msFlCkAlYrK/n/Xqzx31G43VEN+OvF0dQUkl5XHFOKotUbtyhW+q59nm0VwSMZQM91Og8HnxnqDLFKT1I8St88LJ+lgbVd+Jy1UqbuxQ41QIoozD7QYNCse6iLMzwDHDhZjGSKPzhE/GMnT2imm2qwQP11p9KLgnA7uHkgZcc3VpcnVnWyYCN4RglSNdpgXUa7nW5T3jy44vXq7LaqpsmD+SSrmLnEShS5J25QqfLKlRxgozlJPhfopObizL3ZAG//So+7FLPfFv/2rMiEPFFz4nQByvaJqX3JCxjLF2IWp0xgg6roXwpU1c4XN3A73u6SdikGXhS5nYwpekXTnC58a3vzlhpeIbhgOba+FLhBPw5buLOrYRM54QJXz+bJib5OFkqdbN1U0FFc+K4dbouGyiWOiauLppoFzXOKeSJ2nXKOGbq5DQiTA516Laf91c3SwQK7mhA8RxD0UwCZ8tVra5shVVlyQ6PHL1dazVtUpupEGs5Ia2ZJJaA9K6XIfkRhrESm4kbFeT8E2Ft/N6vmxl+LeMXTIOU8hwP2VW+OKUs7jZs7hiEyp8122qVk5oOLdSCavylYxv7FA7qGDTQ4gwo5Tn8RGjnEVb9EknhZrcjILeR0Gccpak7RomfFObevtlWviPxXTJmvXK3AdZ7qfMCp8TH4goYL7uUE1uJ/Jtxym/7PLEUvCr8PXfite729SsItXeu+UFtlj1Ztt2rA1x39N2cGbLLWDWD/OETDCZC5gn1DuQ27/8Wwl36CRWz
aMT1siPRe14EOYC5iXb9Vfh+3FGe75talapRT3344UnVPFt5SyJ+5qBezSz3U/ZFT4hX93njC1rS8BJbsRFFlPv/G2a/I8YYXmXVv2/rqmQA2dHz+NhLLwfzpa1xJhc3aRkvJ8yLHwCOZHiZLRikrrwrfMhBakgF6vwQwrSQLpPWd78vhJk/dzmCrOjKxC+rPdTtoVPgGOp1gx3t8AqVvof2T/uaGXgWKpUybzwyfjaqg4iTVP4UjmI9LGAg0hXwsoOIk1T+B7TQaSPlcnNVWonMAOwtoyv6CrVVSr75Fr4AAD5BMIHAMgdED4AQO6A8AEAcgeEDwCQOyB8AIDcAeEDAOQOCB8AIHdA+AAAuQPCBwDIHRA+AEDugPABAHIHhA8AkDsgfACA3AHhAwDkDggfACB3QPgAADmD6P8ryaoSWU7fBQAAAABJRU5ErkJggg=="
}
},
"cell_type": "markdown",
"metadata": {},
"source": [
"## 3 随机森林的基础知识\n",
"### 1)信息、熵、信息增益\n",
"这三个基本概念是决策树的根本,是决策树利用特征来分类时,确定特征选取顺序的依据。理解了它们,决策树你也就了解了大概。\n",
"\n",
" 信息\n",
" \n",
"  引用香农的话来说,信息是用来消除随机不确定性的东西。当然这句话虽然经典,但是还是很难去搞明白这种东西到底是个什么样,可能在不同的地方来说,指的东西又不一样。对于机器学习中的决策树而言,如果带分类的事物集合可以划分为多个类别当中,则某个类(xi)的信息可以定义如下:\n",
" ![1.PNG](attachment:1.PNG)\n",
" \n",
" I(x)用来表示随机变量的信息,p(xi)指是当xi发生时的概率。\n",
"\n",
" 熵\n",
" \n",
"  熵是用来度量不确定性的,当熵越大,X=xi的不确定性越大,反之越小。对于机器学习中的分类问题而言,熵越大即这个类别的不确定性更大,反之越小。\n",
" ![2.PNG](attachment:2.PNG)\n",
" \n",
" 条件熵\n",
" \n",
" 条件熵是用来解释信息增益而引入的概念,概率定义:随机变量X在给定条件下随机变量Y的条件熵,对定义描述为:X给定条件下Y的条件概率分布的熵对X的数学期望,在机器学习中为选定某个特征后的熵,公式如下:\n",
" ![3.PNG](attachment:3.PNG)\n",
" \n",
" 信息增益\n",
" \n",
" 信息增益在决策树算法中是用来选择特征的指标,信息增益越大,则这个特征的选择性越好,在概率中定义为:待分类的集合的熵和选定某个特征的条件熵之差(这里只的是经验熵或经验条件熵,由于真正的熵并不知道,是根据样本计算出来的),公式如下:\n",
" ![4.PNG](attachment:4.PNG)\n",
"\n",
"\n",
"### 2)决策树\n",
"\n",
"  决策树是一种树形结构,其中每个内部节点表示一个属性上的测试,每个分支代表一个测试输出,每个叶节点代表一种类别。常见的决策树算法有C4.5、ID3和CART。\n",
"\n",
"### 3)集成学习 \n",
"\n",
"  集成学习通过建立几个模型组合的来解决单一预测问题。它的工作原理是生成多个分类器/模型,各自独立地学习和作出预测。这些预测最后结合成单预测,因此优于任何一个单分类的做出预测。\n",
"  "
]
},
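{
"cell_type": "markdown",
"metadata": {},
"source": [
"A small numerical sketch (not part of the original notebook) that evaluates the quantities defined above on a toy set of binary labels; the labels and the split are made-up examples."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def entropy(labels):\n",
"    # H(Y) = -sum_i p_i * log2(p_i)\n",
"    _, counts = np.unique(labels, return_counts=True)\n",
"    p = counts / counts.sum()\n",
"    return -np.sum(p * np.log2(p))\n",
"\n",
"y = np.array([1, 1, 1, 0, 0, 0, 0, 1])    # toy labels\n",
"left, right = y[:4], y[4:]                # a hypothetical split induced by some feature\n",
"\n",
"h = entropy(y)\n",
"h_cond = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)\n",
"print(h, h_cond, h - h_cond)              # entropy, conditional entropy, information gain"
]
},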
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# 随机森林的生成\n",
"每棵树的按照如下规则生成:\n",
"\n",
"1、用N来表示训练用例(样本)的个数,M表示特征数目。\n",
"\n",
"2、输入特征数目m,用于确定决策树上一个节点的决策结果;其中m应远小于M。\n",
"\n",
"3、从N个训练用例(样本)中以有放回抽样的方式,取样N次,形成一个训练集(即bootstrap取样),并用未抽到的用例(样本)作预测,评估其误差。\n",
"\n",
"4、对于每一个节点,随机选择m个特征,决策树上每个节点的决定都是基于这些特征确定的。根据这m个特征,计算其最佳的分裂方式。\n",
"\n",
"5、每棵树都会完整成长而不会剪枝,这有可能在建完一棵正常树状分类器后会被采用)\n",
" "
]
},
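{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch (not part of the assignment code) of step 3 above: draw N indices with replacement to form one tree's bootstrap sample, and keep the indices that were never drawn as the out-of-bag examples; N = 10 is an arbitrary toy size."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"np.random.seed(0)\n",
"N = 10                                    # toy number of training examples\n",
"idx = np.random.randint(0, N, size=N)     # bootstrap: N indices drawn with replacement\n",
"oob = np.setdiff1d(np.arange(N), idx)     # out-of-bag indices, usable for error estimation\n",
"print(idx)\n",
"print(oob)"
]
},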
{
"cell_type": "markdown",
"metadata": {},
"source": [
"随机森林构建\n",
"\n",
"决策树相当于一个大师,通过自己在数据集中学到的知识对于新的数据进行分类。但是俗话说得好,一个诸葛亮,玩不过三个臭皮匠。随机森林就是希望构建多个臭皮匠,希望最终的分类效果能够超过单个大师的一种算法。\n",
"\n",
"那随机森林具体如何构建呢?有两个方面:数据的随机性选取,以及待选特征的随机选取。\n",
"\n",
"1.数据的随机选取:\n",
"首先,从原始的数据集中采取有放回的抽样,构造子数据集,子数据集的数据量是和原始数据集相同的。不同子数据集的元素可以重复,同一个子数据集中的元素也可以重复。第二,利用子数据集来构建子决策树,将这个数据放到每个子决策树中,每个子决策树输出一个结果。最后,如果有了新的数据需要通过随机森林得到分类结果,就可以通过对子决策树的判断结果的投票,得到随机森林的输出结果了。如下图,假设随机森林中有3棵子决策树,2棵子树的分类结果是A类,1棵子树的分类结果是B类,那么随机森林的分类结果就是A类\n",
"\n",
"2.待选特征的随机选取\n",
"与数据集的随机选取类似,随机森林中的子树的每一个分裂过程并未用到所有的待选特征,而是从所有的待选特征中随机选取一定的特征,之后再在随机选取的特征中选取最优的特征。这样能够使得随机森林中的决策树都能够彼此不同,提升系统的多样性,从而提升分类性能。\n",
" \n",
"**本实验中随机森林用于回归的具体步骤:**\n",
"\n",
"1、从训练集中随机抽取一定数量的样本,作为每棵树的根节点样本;\n",
"\n",
"2、在建立决策树时,使用最小方差作为分裂规则,即随机抽取一定数量的候选属性,根据候选属性和对应的特征值划分成两个数据集,两个数据集的方差和越小,该属性越适合作为分裂节点;\n",
"\n",
"3、建立好随机森林以后,对于测试样本,进入每一颗决策树进行回归输出,每一颗决策树输出的均值作为最终结果。"
]
},
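{
"cell_type": "markdown",
"metadata": {},
"source": [
"A toy sketch (not from the original) of the minimum-variance split rule in step 2: for a candidate feature and value, the score is the sum of the size-weighted variances of the two resulting subsets, and smaller is better; the numbers below are made up."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"x = np.array([1, 2, 3, 10, 11, 12])                # one toy feature\n",
"y = np.array([5.0, 5.5, 6.0, 20.0, 21.0, 22.0])    # toy target values\n",
"\n",
"def split_score(x, y, value):\n",
"    left, right = y[x <= value], y[x > value]\n",
"    return np.var(left) * len(left) + np.var(right) * len(right)\n",
"\n",
"print(split_score(x, y, 3))    # good split between the two groups: small total variance\n",
"print(split_score(x, y, 1))    # poor split: large total variance"
]
},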
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 使用随机森林解决波斯顿房价\n",
"接下来,我们会使用自己构建的随机森林来预测波斯顿的房价,这是一个回归问题。\n",
"该数据集是一个回归问题。每个类的观察值数量是均等的,共有 506 个观察,13 个输入变量和1个输出变量。\n",
"每条数据包含房屋以及房屋周围的详细信息。其中包含城镇犯罪率,一氧化氮浓度,住宅平均房间数,到中心区域的加权距离以及自住房平均房价等等。\n",
"CRIM:城镇人均犯罪率。\n",
"ZN:住宅用地超过 25000 sq.ft. 的比例。\n",
"INDUS:城镇非零售商用土地的比例。\n",
"CHAS:查理斯河空变量(如果边界是河流,则为1;否则为0)。\n",
"NOX:一氧化氮浓度。\n",
"RM:住宅平均房间数。\n",
"AGE:1940 年之前建成的自用房屋比例。\n",
"DIS:到波士顿五个中心区域的加权距离。\n",
"RAD:辐射性公路的接近指数。\n",
"TAX:每 10000 美元的全值财产税率。\n",
"PTRATIO:城镇师生比例。\n",
"B:1000(Bk-0.63)^ 2,其中 Bk 指代城镇中黑人的比例。\n",
"LSTAT:人口中地位低下者的比例。\n",
"MEDV:自住房的平均房价,以千美元计。\n",
"\n",
"通过完成作业,你将会学到: 1、如何构建回归树; 2、如何利用回归树建立集成模型(随机森林);3、如何评估模型效果。\n",
"\n",
"```不要单独创建一个文件,所有的都在这里面编写(在TODO后编写),不要试图改已经有的函数名字 (但可以根据需求自己定义新的函数)```\n",
"\n",
"在本次项目中,你将会用到以下几个工具:\n",
"- ```sklearn```。具体安装请见:http://scikit-learn.org/stable/install.html sklearn包含了各类机器学习算法和数据处理工具,包括本项目需要使用的词袋模型,均可以在sklearn工具包中找得到。 \n",
"- ```numpy```,数据处理库:www.numpy.org\n",
"- ```joblib```,这是一个可以简单地将Python代码转换为并行计算模式的软件包,详情见https://pypi.org/project/joblib/"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 1.载入需要的包和数据"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"#导入需要用到的算法库\n",
"import numpy as np\n",
"from numpy import *\n",
"import random\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.datasets import load_boston\n",
"from sklearn.metrics import r2_score\n",
"from joblib import Parallel, delayed\n",
"import warnings \n",
"warnings.filterwarnings('ignore')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# TODO:导入波斯顿数据并简单探查数据\n",
"boston = load_boston()\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'2'"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"label_dict={\"1\":0,\"2\":3,\"3\":2}\n",
"list(label_dict.keys())[list(label_dict.values()).index(sorted(list(label_dict.values()))[-1])]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.构建自己的随机森林\n",
"随机森林模型整体是一个myrf的类,所有的参数都包含再类中的成员变量和成员函数中。\n",
"\n",
"需要同学通过补全各个模块的代码使随机森林能按照回归树的策略建立起来。"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"ename": "SyntaxError",
"evalue": "invalid syntax (<ipython-input-7-653daea8a3b0>, line 40)",
"output_type": "error",
"traceback": [
"\u001b[1;36m File \u001b[1;32m\"<ipython-input-7-653daea8a3b0>\"\u001b[1;36m, line \u001b[1;32m40\u001b[0m\n\u001b[1;33m dataSet =\u001b[0m\n\u001b[1;37m ^\u001b[0m\n\u001b[1;31mSyntaxError\u001b[0m\u001b[1;31m:\u001b[0m invalid syntax\n"
]
}
],
"source": [
"class myrf:\n",
" # 存放树的列表\n",
" trees = []\n",
" # 随机种子\n",
" random_state = 0\n",
" # 树的个数\n",
" n_estimators = 10\n",
" # 最大特征数\n",
" max_features = 10\n",
" # 最大深度\n",
" max_depth = 10\n",
" # 切分新节点所需的最小阈值\n",
" min_change = 0.001\n",
" # 当前树的数量\n",
" cur_tree = 0\n",
" # 最小分割\n",
" min_samples_split = 0\n",
" # 叶子内节点的最小数目\n",
" min_samples_leaf = 0\n",
" # 每次建树时所用的样本占总样本的比例\n",
" sample_radio = 0.9\n",
" # 每次建树时所并行化处理器的个数\n",
" n_jobs = 10\n",
" # 计算y的方差\n",
" # 本来是要除总样本数的,考虑到对于所有的叶子来说,总样本数都是一致的,所以不除也可以。\n",
" def get_varience(self, dataSet):\n",
" return np.var(dataSet[:,-1])*shape(dataSet)[0]\n",
" \n",
" ## TODO:计算y的均值\n",
" def get_mean(self,dataSet):\n",
" \n",
" return\n",
" \n",
" # 根据特征和特征值,把样本划分成高于该特征值的部分和低于该特征值的部分\n",
" def SplitDataSet(self, dataSet,feature,value):\n",
" ## TODO:将数据集dataSet 按特征feature从小到大排序\n",
" dataSet = \n",
" ## TODO:将数据集按特征值划分成高于特征值的部分和低于该特征值两个数据集\n",
" \n",
" return\n",
" \n",
" # 选取最优的特征和特征值边界\n",
" def select_best_feature(self, dataSet):\n",
" #计算特征的数目\n",
" feature_num=dataSet.shape[1]-1\n",
" features=np.random.choice(feature_num,self.max_features,replace=False)\n",
" # 最好分数\n",
" bestS=inf;\n",
" # 最优特征\n",
" bestfeature=0;\n",
" # 最优特征的分割值\n",
" bestValue=0;\n",
" S=self.get_varience(dataSet)\n",
" ## TODO:判断样本数量是否足够,如果样本量少于最小分割,或者样本量少于叶子内节点的最小数目,就返回数据集的平均值结束程序\n",
" if :\n",
" return None,\n",
" \n",
" # 遍历所有特征,\n",
" for feature in features:\n",
" ## TODO:将数据集按特征feature从小到大排序\n",
" dataSet = \n",
" # 遍历数据集中的数据,控制叶子节点数目\n",
" for index in range(shape(dataSet)[0]-1):\n",
" ## TODO: 排除dataSet数据集中的重复值\n",
" if :\n",
" continue\n",
" #将数据集按index分为前后两部分\n",
" data0 = dataSet[0:index+1, :]\n",
" data1 = dataSet[index+1:, :]\n",
" #判断样本数量是否足够,如果样本量少于最小分割,或者样本量少于叶子内节点的最小数目,就跳到下一个循环\n",
" if shape(data0)[0] < self.min_samples_leaf or shape(data1)[0] < self.min_samples_leaf:\n",
" continue;\n",
" #将两个数据集分别求取方差并加和作为新的分数\n",
" newS=self.get_varience(data0)+self.get_varience(data1)\n",
" #如果最好分数大于新的分数,将新的分数赋值给最好分数,保证方差越小越好\n",
" if bestS>newS:\n",
" bestfeature=feature\n",
" bestValue=dataSet[index][feature]\n",
"# print(bestfeature, bestValue)\n",
" bestS=newS\n",
" #如果误差不大就退出,说明无法分割\n",
" if (S-bestS)<self.min_change: \n",
" return None,self.get_mean(dataSet)\n",
"# print(bestfeature, bestValue)\n",
" return bestfeature,bestValue\n",
" \n",
" # 搭建单颗决策树\n",
" def createTree(self, dataSet, max_level, flag = 0):\n",
" if flag == 0:\n",
" seqtree = self.cur_tree+1\n",
" self.cur_tree = seqtree;\n",
" print('正在搭建第',seqtree,'棵树...')\n",
" #选择最适合的特征和值来构建树\n",
" bestfeature,bestValue=self.select_best_feature(dataSet)\n",
" if bestfeature==None:\n",
" if flag == 0:\n",
" print('第',seqtree,'棵树搭建完成!')\n",
" return bestValue\n",
" retTree={}\n",
" max_level-=1\n",
" if max_level<0: #控制深度\n",
" return self.get_mean(dataSet)\n",
" retTree['bestFeature']=bestfeature\n",
" retTree['bestVal']=bestValue\n",
" ## TODO: 使用self.SplitDataSet将数据集按bestfeature和bestValue分割成左右两棵树数据集\n",
" lSet,rSet=\n",
" # 使用self.createTree将左右两棵树数据集构建成树\n",
" retTree['right']=self.createTree(rSet,self.max_depth,1)\n",
" retTree['left']=self.createTree(lSet,self.max_depth,1)\n",
" if flag == 0:\n",
" print('第',seqtree,'棵树搭建完成!')\n",
" return retTree\n",
" \n",
" \n",
" # 初始化随机森林\n",
" def __init__(self, random_state, n_estimators, max_features, max_depth, min_change = 0.001,\n",
" min_samples_split = 0, min_samples_leaf = 0, sample_radio = 0.9, n_jobs = 10):\n",
" self.trees = []\n",
" self.random_state = random_state\n",
" np.random.seed(self.random_state)\n",
" self.n_estimators = n_estimators\n",
" self.max_features = max_features\n",
" self.max_depth = max_depth\n",
" self.min_change = min_change\n",
" self.min_samples_leaf = min_samples_leaf\n",
" self.min_samples_split = min_samples_split\n",
" self.sample_radio = sample_radio\n",
" self.n_jobs = n_jobs\n",
" \n",
" # 向森林添加单棵决策树\n",
" def get_one_tree(self, dataSet):\n",
" X_train, X_test, y_train, y_test = train_test_split(dataSet[:,:-1], dataSet[:,-1], \n",
" train_size = self.sample_radio, random_state = self.random_state)\n",
" X_train=np.concatenate((X_train,y_train.reshape((-1,1))),axis=1)\n",
" self.trees.append(self.createTree(X_train,self.max_depth))\n",
" \n",
" # 并行化搭建随机森林\n",
" def fit(self, X, Y): #树的个数,预测时使用的特征的数目,树的深度\n",
" dataSet = np.concatenate((X, Y.reshape(-1,1)), axis = -1)\n",
" Parallel(n_jobs=self.n_jobs, backend=\"threading\")(delayed(self.get_one_tree)(dataSet) for _ in range(self.n_estimators)) \n",
" \n",
" #预测单个数据样本\n",
" def treeForecast(self,tree,data):\n",
" if not isinstance(tree,dict):\n",
" return float(tree)\n",
" if data[tree['bestFeature']]>tree['bestVal']:\n",
" if type(tree['left'])=='float':\n",
" return tree['left']\n",
" else:\n",
" return self.treeForecast(tree['left'],data)\n",
" else:\n",
" if type(tree['right'])=='float':\n",
" return tree['right']\n",
" else:\n",
" return self.treeForecast(tree['right'],data) \n",
" \n",
" # 单决策树预测结果\n",
" def createForeCast(self,tree,dataSet):\n",
" seqtree = self.cur_tree+1\n",
" self.cur_tree = seqtree;\n",
" print('第'+str(seqtree)+'棵树正在预测...\\n')\n",
" l=len(dataSet)\n",
" predict=np.mat(zeros((l,1)))\n",
" for i in range(l):\n",
" predict[i,0]=self.treeForecast(tree,dataSet[i,:])\n",
" print('第'+str(seqtree)+'棵树预测完成!')\n",
" return predict\n",
" \n",
" ## TODO: 使用self.createForestCast函数更新预测值函数\n",
" def unpdate_predict(self, predict, tree, X):\n",
" predict+=\n",
" \n",
" # 随机森林预测结果\n",
" def predict(self,X):\n",
" self.cur_tree = 0;\n",
" l=len(X)\n",
" predict=np.mat(zeros((l,1)))\n",
" Parallel(n_jobs=self.n_jobs, backend=\"threading\")(delayed(self.unpdate_predict)(predict, tree, X) for tree in self.trees)\n",
" # 对多棵树预测的结果取平均\n",
" predict/=self.n_estimators\n",
" return predict\n",
" \n",
" # 获取模型分数\n",
" def get_score(self,target, X):\n",
" return r2_score(target, self.predict(X))"
]
},
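{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick sanity check (not part of the original assignment; it assumes the suggested TODO completions above): ```get_mean``` should return the mean of the last column, and ```SplitDataSet``` should return the rows above the given value followed by the remaining rows."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# toy sanity check of the helper methods (assumes the suggested completions above)\n",
"demo = np.array([[1., 5.], [2., 7.], [3., 9.], [4., 11.]])\n",
"rf_demo = myrf(random_state=0, n_estimators=1, max_features=1, max_depth=2)\n",
"print(rf_demo.get_mean(demo))               # mean of the y column -> 8.0\n",
"print(rf_demo.SplitDataSet(demo, 0, 2.0))   # rows with feature 0 > 2.0, then the rest"
]
},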
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.实例化随机森林并对boston数据进行训练、预测"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"正在搭建第1棵树...\n",
"\n",
"正在搭建第2棵树...\n",
"正在搭建第3棵树...\n",
"\n",
"\n",
"正在搭建第4棵树...\n",
"\n",
"第2棵树搭建完成!\n",
"正在搭建第5棵树...\n",
"\n",
"第1棵树搭建完成!\n",
"第3棵树搭建完成!正在搭建第6棵树...\n",
"\n",
"\n",
"正在搭建第7棵树...\n",
"\n",
"第4棵树搭建完成!\n",
"正在搭建第8棵树...\n",
"\n",
"第5棵树搭建完成!\n",
"第7棵树搭建完成!\n",
"正在搭建第9棵树...\n",
"\n",
"第6棵树搭建完成!正在搭建第10棵树...\n",
"\n",
"\n",
"第8棵树搭建完成!\n",
"第10棵树搭建完成!\n",
"第9棵树搭建完成!\n"
]
}
],
"source": [
"# rf2 = mycache(random_state=2, n_estimators=10, max_features=3, max_depth=10, min_change=0.001, min_samples_split=20, n_jobs=10)\n",
"rf1 = myrf(random_state=2, n_estimators=10, max_features=3, max_depth=10, min_change=0.001, min_samples_split=20, n_jobs=-1)\n",
"## TODO: 使用rf1的fit函数对训练特征数据boston.data和训练目标变量boston.target进行训练\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"第1棵树正在预测...\n",
"\n",
"第2棵树正在预测...\n",
"第1棵树预测完成!\n",
"第3棵树正在预测...\n",
"\n",
"第4棵树正在预测...\n",
"\n",
"\n",
"第5棵树正在预测...\n",
"\n",
"第5棵树预测完成!\n",
"第6棵树正在预测...\n",
"\n",
"第3棵树预测完成!\n",
"第7棵树正在预测...\n",
"\n",
"第7棵树预测完成!\n",
"第8棵树正在预测...\n",
"\n",
"第2棵树预测完成!\n",
"第8棵树预测完成!\n",
"第4棵树预测完成!\n",
"第9棵树正在预测...\n",
"\n",
"第9棵树预测完成!\n",
"第10棵树正在预测...\n",
"\n",
"第10棵树预测完成!\n",
"第6棵树预测完成!\n"
]
},
{
"data": {
"text/plain": [
"0.9396619095484052"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"## TODO:使用rf1的get_score函数对boston.target和boston.data进行预测并评价结果\n",
"\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.4"
},
"toc": {
"base_numbering": 1,
"nav_menu": {},
"number_sections": false,
"sideBar": true,
"skip_h1_title": false,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": false,
"toc_position": {},
"toc_section_display": true,
"toc_window_display": false
},
"varInspector": {
"cols": {
"lenName": 16,
"lenType": 16,
"lenVar": 40
},
"kernels_config": {
"python": {
"delete_cmd_postfix": "",
"delete_cmd_prefix": "del ",
"library": "var_list.py",
"varRefreshCmd": "print(var_dic_list())"
},
"r": {
"delete_cmd_postfix": ") ",
"delete_cmd_prefix": "rm(",
"library": "var_list.r",
"varRefreshCmd": "cat(var_dic_list()) "
}
},
"types_to_exclude": [
"module",
"function",
"builtin_function_or_method",
"instance",
"_Feature"
],
"window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}
{
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# 随机森林\n",
"## 1 什么是随机森林?\n",
" 随机森林就是用随机的方式建立一个森林,在森林里有很多决策树组成,并且每一棵决策树之间是没有关联的。当有一个新样本的时候,我们让森林的每一棵决策树分别进行判断,看看这个样本属于哪一类,然后用投票的方式,哪一类被选择的多,作为最终的分类结果。在回归问题中,随机森林输出所有决策树输出的平均值。随机森林既可以用于分类也可以用于回归。"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2 随机森林有什么特点\n",
"1)对于很多种资料,它可以产生高准确度的分类器;\n",
"\n",
"2)它可以处理大量的输入变数;\n",
"\n",
"3)它可以在决定类别时,评估变数的重要性;\n",
"\n",
"4)在建造森林时,它可以在内部对于一般化后的误差产生不偏差的估计;\n",
"\n",
"5)它包含一个好方法可以估计遗失的资料,并且,如果有很大一部分的资料遗失,仍可以维持准确度;\n",
"\n",
"6)它提供一个实验方法,可以去侦测variable interactions;\n",
"\n",
"7)对于不平衡的分类资料集来说,它可以平衡误差;\n",
"\n",
"8)它计算各例中的亲近度,对于数据挖掘、侦测离群点(outlier)和将资料视觉化非常有用;\n",
"\n",
"9)它可被延伸应用在未标记的资料上,这类资料通常是使用非监督式聚类。也可侦测偏离者和观看资料;\n",
"\n",
"10)学习过程是很快速的。"
]
},
{
"attachments": {
"1.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAPMAAAAwCAYAAADEikwlAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAAr9SURBVHhe7Z3tZyNrFMDvP5hPo4QSasdyYxllo2wsG5dYGqUpK6VRK/3QuFZKox+yVEqFSqmUK0uFlaVSSyjDCuHcc56XaTKZyVsnTTs9P4bNdDKZmec57+eZ/QsYhgkFLMwMExJYmBkmJLAwM0xIYGFmmJDAwswwIYGFmWFCAgszw4QEFmaGCQkszAwTEliYGUbT70JtOw7Jk47aETQtKP5tQf4/W30OlpmEuXuSgEgk4rOloHqvDmRCgX2eBmNwjN+VoK3+Fj5saHyJQexLA/+1QOwaZIwY5K6C/5X5LPNNAWJigA3InC/01plnQOtrTAizsdtQe8JH5zgBxloOGn/UjgVin2fAMND4ddWOgJhLmO3TlNLWbI3DTw9qm9IyZ857al/IuCtDwnhKw9SFykdUjp+q+K/gmEuY69vK7bJKsKjognkuNCBn0HgnoHyndoWKHrrXUYis5qHZV7uegN5VDqKRKMbPakcAzCHMbSi+lcIc3W+qfUxo+VkEkxQ3TXa1K1R0K5DE+4t/e+JsQL8J+VV8rmidg/IHZhfm+yqkaHBxC63bxTg4IdV2Xe0JFxQrRyImFH+qHU9Icx89ggBD1dmF+TIrBzdiQelW7WNCiw6pUqdhTHTaUP2E92fkMJhYAkqWgjKKMwtz+9CUwhxWt4sZQIdUEyzXfRuqBymIrxpibhirccgct8a7j/ctqOwkIErx+Io6Hs/TuG7P53baTShsRMGIGGB+rkKH4t+7OuTFPrqmBBSuXWf+U4MMzeVpXN2+Da3jDMRX6HkYEN+sQBt/o3uZhwTdtxGFxEFztmunxBtdW0BVghmFuQvlDboZ3DZrwE52yPktJ9s4y2Vf58FCgYyly9BUqdn2SQqiOOETx97pUfumCAnxHRQImkT3dcitGWDoRNtvedz0yCRW/BDjXmXt4h9TYL5JQ/lGilf7G7nTMcj/EB8lKh8wOfejatBbqCToekWtGK9/3YLYWg7q6CZ3TpJ4/ujw+SfSgBw93/flQBLJswlzvw5Z+nHc/AaKwSHalRbqMZt52FJnWx6984y8Hh/FbV/lRL+BsYmWbSgTrC26Rzx4S2Ug/Nu7orBsmu53EgbcP4/HJ/I4SaiQMsFrEucxUCkMhoHaCm4N3IsS/ElzuXuaAuN9SVp7QQfK7+n+DMjU8Gxa6eHn3LU6ZCo6ULLoe8G4+bMJs85sjtNA/Q5Ud/JQC7gg/lR0zrKQPX2kour30GW0H7X1hoRjOcgEjc9k76IAkVB6Nj/oyR6B7KXaJcD9G1LRpc+GHVJHccyRaBNJOuUqd44scR6TrPQgtyWw6PwDykImvybFrG0ovUMhvVIfCceoKWX1pwH5dROsndqMdWP9nLJQD2C8ZxLmh3bODNQ8O2XIHbFQO417OM8ddNl28R4W0G63MB6pPLwVh7Ya3vFyc08K+pClc9DfdSmC//LofnvPH604dKLNvi5CZsMC820MDIypUwd16PpM+N5tAxq/6HsqoeVlIZ3E7YMV1MI8JKgjdKF10Rr+bW3UJrnHPwpgGjEo3KjPI2hhDqaGP5MwO80iG2VPDURtatTb6ifKzcM4xjEyIUEPPPomCzXSbKjl0yp5ElmJ4QCq/csCNW1uDSfcC5Fn+zyLz8ycc7Og6DXZdAmS4mW3EPV1I4mPVdOJJdwGBaXxRY3xSMJpWHH0rnNgDfZI/yoJ13xix5RzXaPKwkncDgjgdMI8iv7epFi7R/mED0Vo+raILk2YVZHb9ybIHZmmXkfHjT5w+ywD5m7dFXstDxr82GtuikGBEkrXK9PrhFs+ySqMW6XCHoyZH5qNrCOXPdOKQ7nAwmgYFpR+yT8TUhFMSDDp6/JQFtrtH5y78wmzbm91ud5zsSxhdoJ8H21MLtTbIg7ZZEZildsypHcXvFplVmhi0OSaQ7k0915+AkxbMil4PWieFKCmJ5xOMvkkq7xdcJW5xW1ECLTwq0Rbc58Wdgy7p9MIng4DR+Jlp9FpOJs9Mg+nwbH+ruSeTWW1jq9X6s2SYmYnQeGjjSnmmbpepjKLYvBsfDif0fV5Jhb5AZp8s5YaFC8+AaZjT3X/wm0esB4/VOzrFTPSsSIx5rY22rMbtUKtA6k4BhtTeu6YWiiIcePxsCDELfA6U+520/WcHpvN/tOE4gcMR/6pyHvV1t+1LoF+w1Fet1XIrKNnt2pCYez8WVI2WycovLWxrD9PX67SGgnj4w28kUWY5H4X6ntJiKPrH/s7DdUbmXE0Y3Ec7Gl+UF5jODufJqHHR4ZCYtIPrmV2BNYdT8t6LLmgowv8tbC5hFmXqsY1plAOAxXB+JhZW0yXpRX5D7pWj6y7UkrjYl+nndWQz6L9LS4/Y0zvQH3Wa9rqYxi5gZbWlvdr7Pmf2/FWnrTO3G9BQcS5uP3jEUOpeGi4DDEerS3dJYqgaO5bkLuU55blCqpDtqD4Bl3gwYEYA8VuI/HdqwAn4haFCnEoXMmGjtTpsCR00KWlbqvEUVt6Eb0OVLdNMAwT0n6lPRQeqksn1KIG+wbDKxI0mlc+LrtWEMZGEVrjpoouPRmGqAmLGvY9jjeVwlaSUB6Ivx2m6ACT1tuE7AXe/30dsmsWWBaeU9ede22opE2wDprSKt8UwCLlIBSeqkP78ZQdYGObH4a6gqSGmT4hgAO0iwNP5xnzIFtHKUh9mn4r6RiLHuTHiqPFRUJF/E4Hal8x9nOaCVBJ4UT1W4bW+ILfm1LwQ8d9AwrrNP4GxPe88xmdizwkY2qOrMQguVNxusD8cNofqZqBFqxylJUuu099meJakzrFJoQd2oJG9+vQPLDk3KJr2qtC27cyojyQcb3Z9Cqhnbg4n2gJpVf+OK2jcl/+YlR5CWM1Keey7N5sb2YT5s5xGo/FhyTcLtW5szBkrOYXAvTQHfLjVQvzE6ETbV7hDHWYJQYrHD+rUPVxxXXZaxbvkJBu8zRVmFmQLx+YVLpa/qopT6TATPMgaYDSSrB0AiL5fYHSLBIW8w3W63WzF0C/55HU07H5aFKMYumsa+FC5yjpkwDTZS88z6x93b9KEMc5GOh6ZnHOuCitkYX2dLXVembvppv5CEiYp0uAUYP9UAlKJ1J8EgBzu9mU/Pq3DE3yiMj9GnCjumdZyF/jxKIOo80EWHg93g/zNSfAAuauAklabWSkh6zQQ293bdiN/43zAsOf2FBzC8bNfsKqqyO+cfc4VH5gkks8A6JERtnuPioZSvB6NY2IbjjXwo9HEpAwS5dhbCBPXTzWcHO98yCVFgsKWW+kOqV0dxxlQWWwdRq0NpS+4gSiB+q7+otCh6Ddr9eJ0wasxwGtdPsiJ1ZbGRaOx5Ak67KY1+Yd29pnafn3rTlfoCCUQYDvACPltRpFBZSUMfYI6h1gbiX2SAITZlH492gaaeyjVtUtnEYUkif6CBtq2ybEx
PpQSiTgce9LEEirxF0VMh9SkEmjW3bZgurnOCQ2M5D8WICGsgw9lGCqXfomH3TTiPrIPIJ+B2o7SXyeKllGya93Kch/bz2u40/Xu13bPC4zJdqe9u2cwbcLByfMotb2giwZdfKs0ANFl9tDnikx8+TvhWKWiFqzPNgTvgie3XuzfaB1n+MWWjwrqPd4qwbdWm60S+eFLbRgAoLKUFsmJF7D/2gxmRe0BPKmCNZWHvKHbk38ApdAMgwSsDAj/HIChlkKwQszwzBLgYWZYUICCzPDhAQWZoYJCSzMDBMSWJgZJiSwMDNMSGBhZpiQwMLMMKEA4H9khDugzrIaIwAAAABJRU5ErkJggg=="
},
"2.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAiUAAAAzCAYAAABIfXhEAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAABSBSURBVHhe7Z3/hxvbG8c//2B+GiUsoRrLjRKlURqlcYlL43JTKkvXdaU/dF2VS9f1kVLpxwqVUlkqS4VKqVQJS6iwPJ/nOV+SmcyZb5lJdjL7vDjszmaTnDnnOef9POc5Z/4DDMMwDMMwKYBFCcMwDMMwqYBFCcMwDMMwqYBFCcMwDMMwqYBFCcMwDMMwqYBFCcMwDMMwqYBFSYaZf2xBqZCH/KNTmF6O4fRJGYqHRSgUKtD+NFevYhhmF7A9MkwwLEp2yGw8gMH7cKX3ug2to1ao0u5P1SfYGcHx/TaMrwbQzFlg/dKE/g/5l+GzHOQe92Bfh8HZWRNKNLCr35NmflaHUr0H0yt1Yctsuz7jlyUo/zna2/beDjMYG+zOXHpw+pfZ9tylDf3v6iMcsD1uCvffYLLUBqFFyeJ9E6wcGo+pWC0Y0osue1Az/V2UPBxfiLe6sUxelVb347BmGND8ShNq94pQPLBs91SVg2MYrU+gn9uiE8HFMeTp3n9S15HBH/g/T/qwUL/vE+RtFm5jf9uydUzQCK0HaOQRhMns34q7bZalBr1L9UIbu6nPHPpPLCg8G/LAvmQCnbur9ik+NtmcT3lag/JhEfKWvY1lyT9Hu1uH7TEGEfvv+ASKjjZpQH8fb25IsjaGRI+UoHEVVGPX33p8PeENyNfkj4RcYYirKZw+0KKiAK2PGzbvAr28d22o3dHv5Rzk7Ez/KUvRon4XHttBDqpvZur3PeLbKVQsvG/nOxhhriZwgpNWgSaSqCxtxILGmU8b77I+8z40LAtqb/ew3beFuP9ynMrFGNQXszH0XtSgqN/L5CQo2B43ZIP+O39bk+2BzkVme30Gx5DIomTZ0LkKnKrwowtbxKRxlmGJugkJDYSCqzmMX1VFBMv63eRpzaH3GP+G6nbJF/IiqtDdOyuVgs5CjzLOLYvEJ/JqC5EjfCsbMUdHJLuvz+xNFXIWfqfMjtDRmb6uLCPAsb3AyzF0HpGjgGLU6JqzPcYhav+dvCyKdi2+nKgrWSObY0hkUSJCjWTEhyfg2dTnLWXoPsLlBjM/aywHQutxL7aKFwOr1YD+T3VB8xOVLX6GXRiO/sxD7lEXP3MCnTKKInU97Sw+tlxh7+2DRn8f20ncr/AsbaTcwXcwcy31+TmEFnrlxuWFG4sMS4v2QjER2wtU0VCjk8D2GI9I/XeG7SDbtPVRXcoYWR1DIoqSCZwcqgH3j4G65kaEKOk1fsLlRjOH4bPCciCsvI6bniQH1tr6cppYv7YLQznJVv6diQ4t1rj3ghl0H+G9utvZeX8SngEZfuhoycpG8p739/rqM3qOk6BvBOcGMseB9rZss5yF9vJNXd8UEeY23GO2x9iE7r9KAGa3r2d3DIkmSmzLMmRIZhY4QcrX+AmXGw8pTj0Q4kDViTsQjrvQ+u/Y4Z1R4qWF3rq9045flSF/UITyk93tLonN91Oo4H26ljCs6vPeAmONMEuX11kfMTHm3AL2hrM4by1z5XL3O7FtY/ymBd3PzvZne0yAsP1Xvc4vWrnXZHgMiSZKPjTVJOrnOcrELXodD3wBfOuIjiXuKeWXrC+/MAKZo2FB61xd2ClT6JSxfcJG/ZY2UvYUmtdaH52Evqe7PbbJ9J/V7qnC0ZDvjwf70H91tD60M7FnZHkMiSRKdOKQ2GLlNYEqBZfLFeHki7rGeJJoot0eMH3bgNItrO+tsjwwipJ1X6traGSlJ12YODxGHXkLGS78PoDjB3mZ/HtQgfY5fsacDqoqyWu/NKAXMSolzpEImR+1tBHHDgs7110flSejt/EzK5LaHbc3zGH0oiK2NVt36rIf4T0YPJfXclYeKi9wTIphj/PPp9D4Rd5T6qvdr3jxh+7TFuQftGEU6TaH6b+raL3vRgsceya0i/GutC+qb+nJKYz9vg/dnxc1OV7R/Xk+gNliCqP3Y5htFOnKahtsTgRRohOHsPgl8J015GvoC+9LOPJamUHvsR4I89D8kOGBkAQrJeTOVfTBqkDtUR7KRwMZur5EBX6AhuNI/lU5Gp6TvA3a2XSrAicX8r/prJGcVYZy2YLKPxO0TrVkFvGgKhKO1D7BCXM2G/H0Iq6/PkuRZTzk64Yz60FN7447aMIgw+YoEiXvnqAToDzfu1Wo3SlC/fVYToJfZSTXuS0+fP+V52eQaCZLUAnFt7H/3kbBR+McjgdVfP+o0Yzg/ot2IdrQx5G4HMIx2lHudh1OlX3B11Oo0fjjdT7Rjz400N6s8jEM6V+uJtChpGaLxu/NohbZbYPNCS9K9E0LW/bhhMIfOqoTo1h1nGTV+23KXE7G8v2yO1nQTgOZta2UNtZ3PTokO3sJOqTmCd3v7gecVng1guPba6IODVLcU5UMNjxS4i/quQVqSSYwIdlmI56vTUF9dGg70q6EtNjKDphje4t8BCxiglLXs4XcnizPR8FJXNR3Pelee8S2yHjY/ivEnTNXTot7uTNpJeCtiGdZBfZfsc0a39tryVWLeazXeqRBRzpdqQe2/3H0YXFsANVjk8TPDLdBDMKLEt3QvopwlU/inYBD4aomdL7EXY2aQu9pM2Lo2s1iPof5ZYwyj1sPiVC04v6qgTBzUSbqGyoXaTl5u5cB9Xba5aCglwMD1i8X/QZYa0a6vq48/W8Nioe16H1PiYHyP75DgM1GfHKuUlAfPTA1P6gLIUmLrWyfpHfHpRCRkK0m0m8dKFNdXZO4imja+3PI/kunVzvOY0Gctr2A4Z9lKN5rQj+ShxDcf/U5QeufL9GRafP2b/3ezk0a+F2PZH8ovhirawpt85sk1Ga4DeIQWpSsjtB2TyRL9M0i4eKhoKavq1D9NyEjpx0s91pbPl53d4z+1ANhDkrXkVW9VehZI2rdVWfGuzx8bYC2SIPuU8YBZsXi2xCGX+0dQa+7evdFCQ5SdQusuk9kT0coAr5DJBvZVn3QO6rfsqD+ztso9ICS+sn2p0HYhC1xBZCIVNH9plLKXn7czykMzyeizy8n8XVv2Rb5W/a5kP2XnvM1diyd6K3yAZHgBPqvnniN+SQ6smGPPNjQwt8RhZh1xRKHKU9S37vl8sfXHrR+LUOJHkFwKw+Vp13vHJUMt0EcQouSUAdC6XwSr1AWqsqST8hp8m8VPb+CTDq6VRBP0JQFr90qQe3v9YQf/EyKMKBi9L59e4Qj0W4fT3kMh+7Qrmja8myB6Abo4goFq1hXDgqrTqGLg0jjnY9xhRQlSxvxW07Zdn2+d6F2rwE9n0FnP0TJHPp/aPvfoNw7gTWfNjr205cjHqC3TwyfyTHHNYkvI3+2SWzT/qv/LygPInb/1dF688Sr62qOMqwSZO31059n+u4OAYR1rNJD8fQcdTmQSz638f8CIt/ZaoN4hBQlq2UZv7UncTohVdq4locN/js9C8SgXu2oMJars
qqB3Vv1SP1FPwo8tYjnpmQ0ZCzQhm9Y4sDJXwhS+8S7qQFqYw5adw1DKFGyshHfpLEU1Gc/REk6GP9VQK8aJ4SYy8TpRXvPbrFrnIw37L/aYTUvqUTDt//q3CfjxKvriv9rOmdrKfztOSU2oeI6d0vbvLx3MspiQfP9aobS39WVo+IgY20Qk3CixJbk5i0qVqF3o3ARITCfsLZChrHM24nl6ZpuBUzJSRsde/sDv5P2hDYtSSbvqUhJtrcGa8N39wUtah27b7QBBqyfrqONZl0gzD4PYBzV5VWixDenJJSNICmoj36vSOvBabOVXSAiJRnfGqz7rcuRlEmY1G6OnR8b9l8dCXBOznOYnA9hGjAnrOPXf5fReiUgpv02nF7ob4pjD/3NK+fLuLSzSsp3TcD6Xqh7N8O5i7bYOpY91NjhO3lnrA3iEkqUrJZlzCExge0kS5MqFGIjxC4BEQ7z2k6sGtg16NP1oJCUB7GT9yI2pjcyuW4rSa4XbSji4Nr+rH6Pw9UCpu9bUKbvqS5FwsvjpzV84XWsRb30kk5AhGD6rgHlwzIci6dlrrwbh9GIz9A7eyjhugrFO3nI1wOePxRi943DRvzOM9lWfeYjaD8sQvEgj4Oiv5VtmjmfHlvZAWK3xZYilkna42IKg6Pyxt9z2W/XvWedR7H+8LVQ/XcBo5e0FF+DrpgvdCRg7UBB+gwtABLqv9qxkXMQfa49B0uPMaZ5TEbyKdJRceQ8roTB+ufN39XFdbujsPjpnJukI+0fKclaG8QllChZLsv4TfwoDGTo3RzlENsXA8NGqtN4KUCvyUEoR3foa58g5WnFfWqwB4vzYyg/PIFRnElh1ofWowqUHtagQm0UMKl6sUwGXfMKpnQEN1537scnVJ/wFZ16sMlB6RW+Kxmt8Oqdg4/wZFQUhna3VCjyIUSSf/6O9gr8DDCUjQi2UR8aUOXWP7GlMSD/gc8pCWDLEcsk7HHWb0H1fgmqjyvCu99UlCy3lTvGXL3bxLRDJUT/XTqo6mnJXztQEr87D9uixH5p78n1X/k3NQeRbTuiHivh4RIYOH/R7kfLnhOi0NuEHf+jtwjjdb/VAxFlCcgpyVobxCVYlFyNoX2XvgAWn3DR+C/ZcLlcEwauBsAKo6cXuKXSK59EYewchMhQ9j7WO+3oA276IU4MvX6UoW0oSmRiGB04lIfmeznkU1SA8miKRwNXIrPuO/4JntILyD/qiEz3yasKFO6V8T1X53zMPuJEQCcmCiNC76cu34+2zQU91GppgF7tY7eRX4PO59lCfeY9qItzgSbQwe8hhIwnqv34RFcPZMSy8Ht/wxM6d4wK5W8mSvSSO9kjTkgUcaMTll+Sg2BB9bWpH4Xov8qTL/4xwHs4R5svQBn7r4U2JMboqwVM3tShWG5LYZZg/xX2TBPx25E42Mzl5ND9QoFvPUCbp8mMIr/vmihecPz5zeP5Q+K8D7RHrI8Yn74PoEUHr5G9+9wH6WgGjevZa4O4eIqS5dYoU9HLMCIp0/B3UeyDuKxEkOH45ZPQYCHDaKaGkGuF2wglbR11amfsRDrsZLs5ByKOKNERgBp0P6ER3JFeS/5uAzofvd9NRirQu/E5MXH+uQO1A/l+JfVws+WR9uLaKYzX+41aMpKHF3mhBg1D8vbSwzEVH4PdWn3EmnjAri29xTDievRNQUwkSSyh0nbmXdzgOKJEe9PocQ8+taGs+lbh4TH0vnhL6zD9d3bWXB3F/mKEo/fqOHV9PLvrHifRf68m0P2tKCb0/GOPdqRHNzxUuzxFfZvQ1ae6euA8qr0G7TdtubTisRuVHM0infzqJRo0WWyDmIRLdI1NGFGi1J9XPolaX5Mn0a2zp6JEhACTSaQjD8G1xfZbDxr3ilA4KEI7sd1JMUSJTtCKetCQCj0m/kRMYYB+EQtEDRq+O2qisqX6iEeKC2/HB35KsCcyYonjT+xbQ96mwbnahj3GESV6yd21qyQA7r/LPBDjuICOZg0F0fIZXpdD6NEzq0xwG7jYkSiRUQ7/5RvlRXuoL/ncj4p5DWsfl2/UuvVmYdc15pRzsD4I0nMZ6NkdUuxZm+xOMrK5KNGGHPVIYznI42cGLLNEA+/L75YUuRQxoXCu+osdmahmO/Y+EbZQH5F3otaPUWxVPMKvYtDxODjqRpNUxBKZYz+3XJG1LdljDFGil8ON22N9uUH992oBC8OEJHeyuHNTyNFsP11bBvrQ9JzAuQ3c7EiUqFC3X6Kr2pVhNC6xNc/yPglWGGaAx5sq1Lp1Eol0dH4LLYOsD4Kf21AmFW/vaDCGzuMaqviwpWM4gGpzUWLekhYOcew6beX7pC7EZii2JtNuFppEzIP6DLqPcAJBzyHqkBFE4vU5J4+L8rnooVte4l0Kf3cy8Q0nwYjl/EMLitivXB6syR4pedxodx7lqO/uhxuLEmXHXttjA7gR/ffnEI7FEvPaTikxH5EgwDHXLj7mYzhBRzN/Rx3gp0oe29u8zMJtYGJnoiRoS7DcleEOec4uOlA9KELzzGda+NgyrvmnlUTWrSlB6/2xWoP0DuMJT5+ytON8loMNRQklg4otaZtGHeTnWk+SOr13AcPnOGAUilBWORsuxPLOto4YT7g+YgDNQ+GwDI235pYRfYE8nGRuYDZIKmJJW3OfUyIh9XGvvLgt2OOmooQSG8V3bRg2JoThBvRfMUnjPTpowZCiAjjmzi5OoS5Oaa27nru23IHnKh4TPLeBkZ2JEmk87pDP/Kwp1GRBJfg4VCb+XKVnBwREQCgEluia/xYR69b5OnS/qbMbopSvIxi878LJ05pKeNTFaxCUnn6y90YaQnhRstrn7yzYFwxhUV+Eh4IerTi7Y8tcTeCEstATXrN1sMv6iCU+9NDPWJGskBHL/G9dmJrsLaBMLgYweHMCzV9LSoyo4ukgbcEeI4sStUxu/75UNlkGyHz/ncPo7xqUCzLCS8UqVKHxtyFBNBLcBn7sTpSgQdLTGQOPmY8MbZ9Mes1/S4gOZOiMcYvXICiSoeS9IYXb6F9c6/JNEiSXjOgP5TDt4mnNu6kPhWO3d+7GviIiliZ7ilk8kw/X7fHN/65x+SYZuP9eP1lrgx2KEuTiGAoJT2b79EA+emrj4H240nvXM143Fo9zxsWSGO10Ia//gQpBxuZ6RQkxO2tAkQSD+j1p5md1KNU9lnS2wLbrM0aBRfkMPKDboadWG2zJWHpoj6br5uJ17P9W7PGaRQnB/ff6yVIb7FaUINPXVe+E1aj8HELr3rYV4h7zvQvVgzwUD6vQ/hTzJv3oQl0sq9n294vf296nCjIMsyJBe5zRwVdkf3ppQT9V/S+2Rma/2bkooTDQ6EUTOl/iLuNMofe06Uo2YhiGYRhmP7kGUcIwDMMwDOOGRQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzDMKmARQnDMAzD
MKmARQnDMAzDMCkA4P9k4UBrilysOQAAAABJRU5ErkJggg=="
},
"3.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAS8AAAAvCAYAAACyqUxsAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAA1QSURBVHhe7Z3/ZyPbG8c//2B/GktYwtpRPrHEZaPcKBuXuDSW5rJSbq2rnx9a18rSWFcvlbJCZaksK0uFlVJZy1CGFcLzec63ySSZOXMmM9Nmdp8Xw3Z2mi8z57yfr+f0P0AQBJFDSLwIgsglJF4EQeQSEi+CIHIJiRdBELmExIsgiFxC4kUQRC4h8SIIIpeQeP0AOKM+9C/Nju7pEbQOWkbHUW8i32HzcN43obTbgSSfcHRcgvLrIbjyZyJfRIrX9LIJ1tYWbAUdVgsG7KK7LtSC/p8fBTj8zF+KyIjxm9L8fm/XAoUo/GhC7Rcb7MeW75nJ4/EhDGfyTTYI96oFxSc49hKrjgu9PQuKrwaxBGzytrx6r9Sx0wGHXfT5EApB/8+PGnTv+EsRCTD3vK6PoChvfv085FHP+tCU1xQOuKwR98FsAp0dJT5FaF2tOaunDowujqD2VL0WGp5P8v82hdsOVCz8jh+n8kRC3B40LAtq51xyYuFe1OV9suHoWp5c5rYNZTknqu/ivwcRjrF4uec1+aAq0PkmTy7j88Aa71MaXIQZfFKLe7+V1CuZuTB6U+Uet/WyB5vzJIVIW3u9VEM9598qRhHoDcXUluHrgrjfKgIJwvPAbDj5Is8RqWAsXv19OTG2T2Asz63wsSVDTI3AEZnhvm94Ib71oivClwRMTitgWQ3ofZcnHpgphouFLLzB7wNoPcZo4c+hPGHCBNplOSdQTMME3nlXEdfoBI5YC0PxGsPJtnxQ+315bhUvF6ATOCJDXBi8KopnsGVB5TRpwl3khGphaYJ7xYGzXfxez9qZjK3hn8yLipGL+o7hJr/PW1B+G36fPaOvEThiPczEyxcOVkLj9ikOdPmgNAJHZAzzIp7I54AecPtWnl+X0Rm0/hk9/MT7imExfif7OCOzKMM7Y6H2wkELWh/luRXmRl8ncMR6mInXh6acDLrK4RAO0fVm122Gpf6JuW3zic6fGct/bUjYlwSRc9UJRUJUscnQQ5pXHDXemmf0qeKeBUbiNT625YPS5D+kZaTE5GbA81X8eWzFbgXImsl5A0qP8LM9KsPRJ/xkrEBwKs+hQJX2zmC80KKhvHrDsO5rHw53CqLg8LgCRx/xPdwRdPZK4tx/G9Bd8Ugn0HmO72GUm/JFGboUyZXKAW9O3vBHwkC8HOjsyAdVboc2BU7fN8Q17OFvYG/Qz4cD3Rfzlofmhw2RL2bkWBHAlQlvqwK13QKUD/owYePmDj0g9OAXCw4y/GJ9Z/JMKKzq+qgCJ5/Fb4+PS/geZSiXLai8RZlxZViNr798Rwav2L2qQOerPBEKvoaq7GpSJJ7Rf56smZYIJlq8fL1bRkfAoPjp+Ka80ASHVccJLl9vXVwhBOL1TCZl9rD2AlHVk54OfrZlz1CISAnaN/KEGoNRIjAbwuGTJaFG74d/f5noHxxIQVfNpD5UKNi6kifC8PVumRyF13GqmIQp0eL15QRs/hB0+YZ5vis8oerC8H9NaH/JIPX7bQyTlNzyyUUTmufJ7eTUdcG9S3C46dwn3o0uJ5GFE5Z7Nw8GGycy/+MZxdWQSlXovNypSklE5KOmvQZYSwKnBEkJyOSfGtjbtcBxyEJtdm3zgzwRghdlJO15nE2g+8ch9JL2tATg3EwevsgSxfchnOwdwXBNIx0pXl6filG+CwUuxGpNTqtQfZeN8zw5baXoVUzROpfX71LfQIavVfvEFpSyqtYZ4cDocgQOE1BVrVvxgOb9U16rhxpf6KHpmN4OYHDjf24qNxU+Lv0o8YpqMfG8N6N8V5jHy9pacJyltVJgAfRqX+UkVMVnW2VrVNcwqpHi5fWpmOS7whKq6L2VQl3+MXR2bbCLckA8KoL9ywmM5P8yhn8VxUCwCmg1Gyud0DrxGh6XwH4qkrdsEBeeNqHHPqPThbpaz8fec1ueZ/B2A5aXkT/nnYXlQ1U4y8DSx0UJxYqn7uuf8gTHULxWmKnclFmi30y8fM2pJvmukDwdayhm4XKgdE370Nqerze1HuO/9/2rChw4+02OaTlfFteg6sQLvZ1nvrWsbE7J13bO61CQuTyruPye2cHu+zpGNUK85uGgpVmr6C2TCLREaP1eWtHLhXShAVt/touCFnInoz2vMbSfse+x6D26Fw2wD/rgBqg+G3zFHylXwdemptG4mgbKIwpoIfA8Fp/grCteKuVhmDA3Ei+jnkcXui/ENcGhLhuPBlV5eS8CUzH43aove8KLXcHA87ppQ2nl8+Hn3rOhdd/FHWaw1liepRcvX+I5XHzmlihQ4JwzqBqVioMFhnsNv7e0a/VMwkY1ML3vcduBOn7e0JdlA3/dXRW+4XdW1ah1jzQS9grpeW1Oy4TyiFbHhTKEC9VGnWHToJ75csLcue7DKGCiqOu1OS9vCZxGfHxFrkCB+4Qhs8kqFD6p8XWWr2UV0991oZaBeLFreMFk/gwmp/UHSpcIY1b9N556acVrITEZJg4+SxTUnMqbCwMqO0EIV9ufn2B5gXpkl7hRzss/ASIfPgOvSdBcmDhhHyn2poglQw+frPcR5hGxaiH39IuL912FkhEe1AQ96fJ2GQ55Hmnei7UgRvw9fJVMHybVRi8c1LUEed33weOHCbQukpmjvoNv/jFDVG9BX6sxJuLF5s3coLPCTv0BvXL+WWJ2KmjFywsHdf01npsfbIl4ctPU3ZeD2pLX8yS/wQ01S9grS1MEe0fvyQnE9XlfLcAGhZXK3lfp4RWBljyKCZ5nY2k1XJeipu3zmqc4Sm/wVZXXsmR4HTSmYYvWo/u8fOGgZqIpUQgucom+SdPwXTkQ4nqZ5I/0jszEyzPoT2yo6KKQJLDm4P0KlLaLUGS5uesO1LZtKBar0PE7JagjRn18PsLFazaCIx7G4aFx10d/qe77JvRXLJGwHObrunyDFK1X2TDMMRMvHDZs6xP8rPULs8fEihV5XpMmNu3DCbRhO3yIIpAFllWA5qV4FsxrYjm54Byk8kB0iXfRyFrYbfPc6PhNBSdLGV9z3vflXOGYelqHbuBYkcZN12GPUYYQRE1L0AwFri6uCS5yic8Z1Y7hoSIbFMvRed3ImBuLF0v8s8XuW3hPzKZETBy8F1XPy+XPnc1tpwf1R0sCznvngjQknBXxmq/ZCjhU+OfbmHD1QMvlTRYxIOIkiQevWBUEB3aM8qmZeKHVOrCFl2jonnJLHDdJvCnITvMF67YOMwwpUuo5EygPqQZnn7pQ5xsfWlB41oD2Vfg4Ed6Mfm2je92GGq+isSVGXT5+vKVI/FwHRmHiF7q2cYqTTlbmAg4V/s03Jgw4FgSRpSMMGmE9pKji57eNvSND8WLpE7nxZHiEMYL2ixrUjA80HvI3WSRV8bYZErlxnn/8PoTOQWexv4t7gTqvdxV9wj4x8cVLiGc51m4IZgl7l
ox0pAU3axfIrXjxJTAJdlT1wbaYXvEyTEOBIFSoomm9CURWx8KboBMSd1eJtYkrXspTjeOVGIiXKoR9k/nE3bPAUDo1uAep6bfbPPESOQLz0EuGBzFj3yjx8icjVQ7BpLKRy7BRVhbjGIxQ+BbJy7nMGKFAAOr+myWs/chqdJb7ed3LxovC8zQOG2WYGS+ZHSVeLAqpS0OjQvJs+//Ec9fc3zTCxrSJlbCXTYUqYW+KTrzc65PFlgiVyI3s/RFeY74S9qKymEpLxF1fhBTLZfo4oUAAIi2w3n3ly39YBS/tnVRllfN++vriJeyVpxrPGOnEy4XR8WJLhLivYe+RIGxED/3s7x4P33nxz9d1MDxuwFlmCfuUiNMqsa7rHipeGGpUyjj5FtQcLc1LNoGCy+VzmHufr+19eGUxaUvEbAqTy0Mo8zxRRJgWFQosw4pAfHO+qHsfhjAo2exhj15Bmi+qwbxVQs6f2C074eI1fluB8vIzVQY9Za+WOy5WHbqOzHMqp+S2A9X6YsWX5zRj9vFlLl7CckS44zf4ZfzLIdjShD3zPdiXxWvwGn9fLQmyClB9px6JC719G4pyYvJlF899lsKPalKVP246vLJYqKM1C+gZizpuhtC/PIOTP2oyua0OvXhHhgIevhaDhQN/N24tgP+hkfT/elDj/T0pFwOflRXRpDq9bIHN8oiy2FB4akPp2HQ0LokXRjSHC8uNqtBRxuOuB81tufxOvk/lTeCMiM308wlUf21AYxdD1OsBHO2UoLZXh+p+8H5tcZ2W7MULJYjtKxW5PCgBJgn7uLBmRN4vlAf4hPaLQkpH0ASLEwpkhGgBSaN3bb2/25gYvnVPll69LmzcQLjnZ2IEF7kH8UIwHCwari9bh9TFK2cLs2P9xeyLbuD5wCNgDU2cUCBLHPT6bBYiy5/X4SH/YjZrlg1dmJ2YfIkXCxkzWJidHrQlzo+BeShA6JHd8rQlTnZb4qRHdpsRpileaW1GSBCRzLLajDAn4pX1ZoR5YHo7Sm0nVYLIP1OYjHKwk2pCfgjxIgji54PEiyCIXELiRRBELiHxIggil5B4EQSRS0i8CILIJSReBEHkEhIvgiByCYkXQRA5BOD/9g455k9Xo4AAAAAASUVORK5CYII="
},
"4.PNG": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAT4AAABBCAYAAABIO8iQAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAAtFSURBVHhe7Z3vZyPbH8e//2AejSWUUI0+qEtcNvoglq1LLBvL5kGlbK2v3get68rSWF9dKssqV5aV5UpZZaVUVgnLUCF8vuecOTOZbGbOfGYyaSed94uxkk5+nR/v8/l1zv6HAAAgZ0D4AAC5A8IHAMgdED4AQO6A8AEAcgeEDwCQOyB8AIDcAeEDAOQOCB8AIHdA+AAAuQPCBwDIHRA+AEDugPABAHIHhA8AkDsgfACA3AHhAwDkDggfACB3QPgAALnj/oVvOqHhZY+6px06+6dHvcsRTab6b5LbPvWvJ/oBAACkD0v4xu+rVCgUQq496v7UN5qYjqj3pkpFq0BWqUaNgxa15PWyQsWtOnVvxD03XapvFunw0nkJuE9G1K4E9a9zVd+P1V2Dt8XAv6vreZdsdRfwmPSoKcZ8YHsVLGp9lTfZ1H0e9HfnKr4dqLcC6RHP4vt2RCXVGRY1PvGHuP31iCpPxOue1Kh96UwgP/aXFpU2K1TZlO/doIs7/QfwAIhJWNeTbvuIrvzWuI/Ru4qemDXq3OongYErOlLjW1x1sUCEtGvvtb5no0X9kHvA8sQSPvt8Tw92ppUnGJ3XlVhalUMahGrlmM6e6Q6vtIXtAR6OAR1uOH1hHfT1c4t4lt/2CQ31c8DAzy7tqbkzs54X8Vl+Ly8IAZ/VEUv4vNWIKU7KkpP3b4rVK8JAdC0I02QD98BNmyqyz8TV+BQ29cbU2XXuQX8x+dx05k6hTCff9XML9Kml3eJwcQRpEEP4hnSy7XQKK+Zw06Gq6sQSL2anB0b4ZAP3weRTQ/VDoVChtoy7BjHtUVPdg/7iMjwu63ZtUi/MhfUWHZM4gjTgC5/PVI8c7NMRtZ8691rCZGdFA4V1WChUqfNDPwYPgufCWsJK188t8P2EymosGMQR+PC5sLsdYS8HM1t0DOIIUoEvfJ6pHj3YbdGBlrq3SIf/6icjUJ2+cUjIXz0kvsyuIUPrxXrRXzx8FrLJW/IWHWTHVw5b+DxTPXKwD6n9m9PJcSbG+EOdym8RL3pQ7i6ooSdo+Tg8ZdHft5z+RQCeh2chm7ylEXW0l2Rqe5AOTOGbBbMjB/t1m3Z0J1v7ELK14mtLW+oFan7Wzy0wi/VWT5F/5zCrgzV4S96iY1Hri34OrAye8PlM9ajB7i92RuA7HuZCcd5lvWDGVAOY1eZxLhSac/GqIVgXv1QMJIcnfJ6pHj3Y+/tuB6YV+LZp8GeT2t9XIKK3QxqlVCw9+tik5vmSFtB0QvZPe7krcTNN6OKl7jtDAH5Wy2koNL/pUvPNReh7JOJuQCcvjwy1oFllZiEvWxdp/3tEzXfD9MML9pCGaXXWmvQTS/hmlkjUrgrf5EkpMzU6rVHt/WpcqtFpK8Us8oT6BxXhpqxrWHpWQ8aK7z3tBNdy2uJ9fm9RfxW7b350qPZMfG7ScbXkwjK3p5zLbYeqaj7w4nuh4aEb/dv1w1T50krXvV62n+4BlvB5prrBEnDwpe1DExu+rTsL1y9WorA0d8ImmFgXO8/KVC7pifikROXfT8S7zxj8t+TErKwilbcb1P3ly5uEb3C8Q+Wtoo55WVTcatKFdEHGXapv+D5zWz8vuROTflMsDuuofb7CZU58Lzg7aYuFryQmUfAEn3xuifYqq/3aTpuWqenf+nh7Rnu6ba1SmSrHi58xOq3STsLgv/2pqT4/2VWhk2/6jeLAqYbwJZX2zgMGz1S0+2/hpV5Lt6tJ+MR4b4j3Lsktp/I7ijFf+cuZZYPjsm9+iPfVz0uW6af7gCF8sy1MnMJlz9UNFSwfvr2/i5NNWI+v5J7gCMNerC5qRQ1KuthiQD0TYhgiRNEWn5uhnrd07Y8NKh/0Avdbyux3aQ03lc9cWDHBwvbeRtVyysQWYwubYzUGF+kOj2vivQ3LqxQJa29hEcsqrLrIy0MqqnYNbpPJRUPVw0bMhOTtyrD41HcQ33HOG5gKI6ZSo861fuwn4/0ULXwsU32GVHpnAhk6WmOMF43PqMY6sCBYnGQRdeeFeascx9V1f4/324XLUT/ohycQZDxUWrsJzPzx/2peVjXplTS50T/QVqxJuDzrJVgcB2+EZc1Y5QMnkUBucaxHZoqdcErtwzooH68awksqBYqjs4+dM/cStyvH1Z3qUIg3PmwxZurUCY3jZ7ufIoVvVk1usAT8eOUs0dtuTC60EsVI19rBqTH0lwGITtmvRyZXWDE+v0Up41cvomIX4p6kGc8UkhuJ4lBiKHsBeEMJkme9BIYxpGfgHrMUgeva+b0CuaDsCytaPzShFqN1KPL1ubDh1RC+8FDQb1JWNnPuJW1XjvAJ/Bbl6LQuXmPugSz3U6TwmQd7EBPRQM5riq8NDS7dUEMwXVkg3DpAnXV2J61KiDBqzHjJDTfwXKLyrtmCdHDuD4zVZBWfCxv+vWcB+EDrRXkG4XGoedwkmL7fTYhwm0xM1DjF8Q+G58IaFkLXkhL3BVrLqrYy2ntySNiuTOGT90mLsri1Q3uM+ZXlfjILn/Th3V0Yf8RQbtngW3J1sKh6fLX4OvuKTnYtsp449yxaCU4HVt4xGleh45CykcVgqwgB5HxXnvAJZ+NDTbVB/SOvBaQly//uD4+MWToutsFKF+JYV/eE/Da1+PAz+W6YY+/8irr1msFlCkAlYrK/n/Xqzx31G43VEN+OvF0dQUkl5XHFOKotUbtyhW+q59nm0VwSMZQM91Og8HnxnqDLFKT1I8St88LJ+lgbVd+Jy1UqbuxQ41QIoozD7QYNCse6iLMzwDHDhZjGSKPzhE/GMnT2imm2qwQP11p9KLgnA7uHkgZcc3VpcnVnWyYCN4RglSNdpgXUa7nW5T3jy44vXq7LaqpsmD+SSrmLnEShS5J25QqfLKlRxgozlJPhfopObizL3ZAG//So+7FLPfFv/2rMiEPFFz4nQByvaJqX3JCxjLF2IWp0xgg6roXwpU1c4XN3A73u6SdikGXhS5nYwpekXTnC58a3vzlhpeIbhgOba+FLhBPw5buLOrYRM54QJXz+bJib5OFkqdbN1U0FFc+K4dbouGyiWOiauLppoFzXOKeSJ2nXKOGbq5DQiTA516Laf91c3SwQK7mhA8RxD0UwCZ8tVra5shVVlyQ6PHL1dazVtUpupEGs5Ia2ZJJaA9K6XIfkRhrESm4kbFeT8E2Ft/N6vmxl+LeMXTIOU8hwP2VW+OKUs7jZs7hiEyp8122qVk5oOLdSCavylYxv7FA7qGDTQ4gwo5Tn8RGjnEVb9EknhZrcjILeR0Gccpak7RomfFObevtlWviPxXTJmvXK3AdZ7qfMCp8TH4goYL7uUE1uJ/Jtxym/7PLEUvCr8PXfite729SsItXeu+UFtlj1Ztt2rA1x39N2cGbLLWDWD/OETDCZC5gn1DuQ27/8Wwl36CRWz
aMT1siPRe14EOYC5iXb9Vfh+3FGe75talapRT3344UnVPFt5SyJ+5qBezSz3U/ZFT4hX93njC1rS8BJbsRFFlPv/G2a/I8YYXmXVv2/rqmQA2dHz+NhLLwfzpa1xJhc3aRkvJ8yLHwCOZHiZLRikrrwrfMhBakgF6vwQwrSQLpPWd78vhJk/dzmCrOjKxC+rPdTtoVPgGOp1gx3t8AqVvof2T/uaGXgWKpUybzwyfjaqg4iTVP4UjmI9LGAg0hXwsoOIk1T+B7TQaSPlcnNVWonMAOwtoyv6CrVVSr75Fr4AAD5BMIHAMgdED4AQO6A8AEAcgeEDwCQOyB8AIDcAeEDAOQOCB8AIHdA+AAAuQPCBwDIHRA+AEDugPABAHIHhA8AkDsgfACA3AHhAwDkDggfACB3QPgAADmD6P8ryaoSWU7fBQAAAABJRU5ErkJggg=="
}
},
"cell_type": "markdown",
"metadata": {},
"source": [
"## 3 随机森林的基础知识\n",
"### 1)信息、熵、信息增益\n",
"这三个基本概念是决策树的根本,是决策树利用特征来分类时,确定特征选取顺序的依据。理解了它们,决策树你也就了解了大概。\n",
"\n",
" 信息\n",
" \n",
"  引用香农的话来说,信息是用来消除随机不确定性的东西。当然这句话虽然经典,但是还是很难去搞明白这种东西到底是个什么样,可能在不同的地方来说,指的东西又不一样。对于机器学习中的决策树而言,如果带分类的事物集合可以划分为多个类别当中,则某个类(xi)的信息可以定义如下:\n",
" ![1.PNG](attachment:1.PNG)\n",
" \n",
" I(x)用来表示随机变量的信息,p(xi)指是当xi发生时的概率。\n",
"\n",
" 熵\n",
" \n",
"  熵是用来度量不确定性的,当熵越大,X=xi的不确定性越大,反之越小。对于机器学习中的分类问题而言,熵越大即这个类别的不确定性更大,反之越小。\n",
" ![2.PNG](attachment:2.PNG)\n",
" \n",
" 条件熵\n",
" \n",
" 条件熵是用来解释信息增益而引入的概念,概率定义:随机变量X在给定条件下随机变量Y的条件熵,对定义描述为:X给定条件下Y的条件概率分布的熵对X的数学期望,在机器学习中为选定某个特征后的熵,公式如下:\n",
" ![3.PNG](attachment:3.PNG)\n",
" \n",
" 信息增益\n",
" \n",
" 信息增益在决策树算法中是用来选择特征的指标,信息增益越大,则这个特征的选择性越好,在概率中定义为:待分类的集合的熵和选定某个特征的条件熵之差(这里只的是经验熵或经验条件熵,由于真正的熵并不知道,是根据样本计算出来的),公式如下:\n",
" ![4.PNG](attachment:4.PNG)\n",
"\n",
"\n",
"### 2)决策树\n",
"\n",
"  决策树是一种树形结构,其中每个内部节点表示一个属性上的测试,每个分支代表一个测试输出,每个叶节点代表一种类别。常见的决策树算法有C4.5、ID3和CART。\n",
"\n",
"### 3)集成学习 \n",
"\n",
"  集成学习通过建立几个模型组合的来解决单一预测问题。它的工作原理是生成多个分类器/模型,各自独立地学习和作出预测。这些预测最后结合成单预测,因此优于任何一个单分类的做出预测。\n",
"  "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# 随机森林的生成\n",
"每棵树的按照如下规则生成:\n",
"\n",
"1、用N来表示训练用例(样本)的个数,M表示特征数目。\n",
"\n",
"2、输入特征数目m,用于确定决策树上一个节点的决策结果;其中m应远小于M。\n",
"\n",
"3、从N个训练用例(样本)中以有放回抽样的方式,取样N次,形成一个训练集(即bootstrap取样),并用未抽到的用例(样本)作预测,评估其误差。\n",
"\n",
"4、对于每一个节点,随机选择m个特征,决策树上每个节点的决定都是基于这些特征确定的。根据这m个特征,计算其最佳的分裂方式。\n",
"\n",
"5、每棵树都会完整成长而不会剪枝,这有可能在建完一棵正常树状分类器后会被采用)\n",
" "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"随机森林构建\n",
"\n",
"决策树相当于一个大师,通过自己在数据集中学到的知识对于新的数据进行分类。但是俗话说得好,一个诸葛亮,玩不过三个臭皮匠。随机森林就是希望构建多个臭皮匠,希望最终的分类效果能够超过单个大师的一种算法。\n",
"\n",
"那随机森林具体如何构建呢?有两个方面:数据的随机性选取,以及待选特征的随机选取。\n",
"\n",
"1.数据的随机选取:\n",
"首先,从原始的数据集中采取有放回的抽样,构造子数据集,子数据集的数据量是和原始数据集相同的。不同子数据集的元素可以重复,同一个子数据集中的元素也可以重复。第二,利用子数据集来构建子决策树,将这个数据放到每个子决策树中,每个子决策树输出一个结果。最后,如果有了新的数据需要通过随机森林得到分类结果,就可以通过对子决策树的判断结果的投票,得到随机森林的输出结果了。如下图,假设随机森林中有3棵子决策树,2棵子树的分类结果是A类,1棵子树的分类结果是B类,那么随机森林的分类结果就是A类\n",
"\n",
"2.待选特征的随机选取\n",
"与数据集的随机选取类似,随机森林中的子树的每一个分裂过程并未用到所有的待选特征,而是从所有的待选特征中随机选取一定的特征,之后再在随机选取的特征中选取最优的特征。这样能够使得随机森林中的决策树都能够彼此不同,提升系统的多样性,从而提升分类性能。\n",
" \n",
"**本实验中随机森林用于回归的具体步骤:**\n",
"\n",
"1、从训练集中随机抽取一定数量的样本,作为每棵树的根节点样本;\n",
"\n",
"2、在建立决策树时,使用最小方差作为分裂规则,即随机抽取一定数量的候选属性,根据候选属性和对应的特征值划分成两个数据集,两个数据集的方差和越小,该属性越适合作为分裂节点;\n",
"\n",
"3、建立好随机森林以后,对于测试样本,进入每一颗决策树进行回归输出,每一颗决策树输出的均值作为最终结果。"
]
},
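{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 补充示意:步骤2提到的“最小方差”分裂规则。\n",
"# 对某个候选切分点,左右两段的方差和越小,说明切分后各自越“纯”,该切分越好(仅为示意)。\n",
"import numpy as np\n",
"\n",
"y = np.array([1.0, 1.2, 0.9, 5.0, 5.1, 4.8])   # 某特征排序后对应的目标值\n",
"\n",
"def split_score(y, i):\n",
"    # 以下标 i 为切分点,返回左右两段“方差乘以各自样本数”之和(等价于平方误差和)\n",
"    left, right = y[:i], y[i:]\n",
"    return np.var(left) * len(left) + np.var(right) * len(right)\n",
"\n",
"scores = [split_score(y, i) for i in range(1, len(y))]\n",
"best = int(np.argmin(scores)) + 1\n",
"print('各切分点得分:', np.round(scores, 3))\n",
"print('最佳切分点下标:', best)   # 应为 3,即把前 3 个小值与后 3 个大值分开"
]
},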
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 使用随机森林解决波斯顿房价\n",
"接下来,我们会使用自己构建的随机森林来预测波斯顿的房价,这是一个回归问题。\n",
"该数据集是一个回归问题。每个类的观察值数量是均等的,共有 506 个观察,13 个输入变量和1个输出变量。\n",
"每条数据包含房屋以及房屋周围的详细信息。其中包含城镇犯罪率,一氧化氮浓度,住宅平均房间数,到中心区域的加权距离以及自住房平均房价等等。\n",
"CRIM:城镇人均犯罪率。\n",
"ZN:住宅用地超过 25000 sq.ft. 的比例。\n",
"INDUS:城镇非零售商用土地的比例。\n",
"CHAS:查理斯河空变量(如果边界是河流,则为1;否则为0)。\n",
"NOX:一氧化氮浓度。\n",
"RM:住宅平均房间数。\n",
"AGE:1940 年之前建成的自用房屋比例。\n",
"DIS:到波士顿五个中心区域的加权距离。\n",
"RAD:辐射性公路的接近指数。\n",
"TAX:每 10000 美元的全值财产税率。\n",
"PTRATIO:城镇师生比例。\n",
"B:1000(Bk-0.63)^ 2,其中 Bk 指代城镇中黑人的比例。\n",
"LSTAT:人口中地位低下者的比例。\n",
"MEDV:自住房的平均房价,以千美元计。\n",
"\n",
"通过完成作业,你将会学到: 1、如何构建回归树; 2、如何利用回归树建立集成模型(随机森林);3、如何评估模型效果。\n",
"\n",
"```不要单独创建一个文件,所有的都在这里面编写(在TODO后编写),不要试图改已经有的函数名字 (但可以根据需求自己定义新的函数)```\n",
"\n",
"在本次项目中,你将会用到以下几个工具:\n",
"- ```sklearn```。具体安装请见:http://scikit-learn.org/stable/install.html sklearn包含了各类机器学习算法和数据处理工具,包括本项目需要使用的词袋模型,均可以在sklearn工具包中找得到。 \n",
"- ```numpy```,数据处理库:www.numpy.org\n",
"- ```joblib```,这是一个可以简单地将Python代码转换为并行计算模式的软件包,详情见https://pypi.org/project/joblib/"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 1.载入需要的包和数据"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"#导入需要用到的算法库\n",
"import numpy as np\n",
"from numpy import *\n",
"import random\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.datasets import load_boston\n",
"from sklearn.metrics import r2_score\n",
"from joblib import Parallel, delayed\n",
"import warnings \n",
"warnings.filterwarnings('ignore')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# TODO:导入波斯顿数据并简单探查数据\n",
"boston = load_boston()\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'2'"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"label_dict={\"1\":0,\"2\":3,\"3\":2}\n",
"list(label_dict.keys())[list(label_dict.values()).index(sorted(list(label_dict.values()))[-1])]"
]
},
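{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 上面那行链式写法的作用是取“值最大的键”,用 max 的 key 参数可以写得更直接(等价示意):\n",
"label_dict = {'1': 0, '2': 3, '3': 2}\n",
"print(max(label_dict, key=label_dict.get))   # 取到键 '2'\n",
"# 设计上的差别:原写法先对值排序再反查下标;若存在并列最大值,两种写法都只返回其中一个键。"
]
},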
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.构建自己的随机森林\n",
"随机森林模型整体是一个myrf的类,所有的参数都包含再类中的成员变量和成员函数中。\n",
"\n",
"需要同学通过补全各个模块的代码使随机森林能按照回归树的策略建立起来。"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"ename": "SyntaxError",
"evalue": "invalid syntax (<ipython-input-7-653daea8a3b0>, line 40)",
"output_type": "error",
"traceback": [
"\u001b[1;36m File \u001b[1;32m\"<ipython-input-7-653daea8a3b0>\"\u001b[1;36m, line \u001b[1;32m40\u001b[0m\n\u001b[1;33m dataSet =\u001b[0m\n\u001b[1;37m ^\u001b[0m\n\u001b[1;31mSyntaxError\u001b[0m\u001b[1;31m:\u001b[0m invalid syntax\n"
]
}
],
"source": [
"class myrf:\n",
" # 存放树的列表\n",
" trees = []\n",
" # 随机种子\n",
" random_state = 0\n",
" # 树的个数\n",
" n_estimators = 10\n",
" # 最大特征数\n",
" max_features = 10\n",
" # 最大深度\n",
" max_depth = 10\n",
" # 切分新节点所需的最小阈值\n",
" min_change = 0.001\n",
" # 当前树的数量\n",
" cur_tree = 0\n",
" # 最小分割\n",
" min_samples_split = 0\n",
" # 叶子内节点的最小数目\n",
" min_samples_leaf = 0\n",
" # 每次建树时所用的样本占总样本的比例\n",
" sample_radio = 0.9\n",
" # 每次建树时所并行化处理器的个数\n",
" n_jobs = 10\n",
" # 计算y的方差\n",
" # 本来是要除总样本数的,考虑到对于所有的叶子来说,总样本数都是一致的,所以不除也可以。\n",
" def get_varience(self, dataSet):\n",
" return np.var(dataSet[:,-1])*shape(dataSet)[0]\n",
" \n",
" ## TODO:计算y的均值\n",
" def get_mean(self,dataSet):\n",
" \n",
" return\n",
" \n",
" # 根据特征和特征值,把样本划分成高于该特征值的部分和低于该特征值的部分\n",
" def SplitDataSet(self, dataSet,feature,value):\n",
" ## TODO:将数据集dataSet 按特征feature从小到大排序\n",
" dataSet = \n",
" ## TODO:将数据集按特征值划分成高于特征值的部分和低于该特征值两个数据集\n",
" \n",
" return\n",
" \n",
" # 选取最优的特征和特征值边界\n",
" def select_best_feature(self, dataSet):\n",
" #计算特征的数目\n",
" feature_num=dataSet.shape[1]-1\n",
" features=np.random.choice(feature_num,self.max_features,replace=False)\n",
" # 最好分数\n",
" bestS=inf;\n",
" # 最优特征\n",
" bestfeature=0;\n",
" # 最优特征的分割值\n",
" bestValue=0;\n",
" S=self.get_varience(dataSet)\n",
" ## TODO:判断样本数量是否足够,如果样本量少于最小分割,或者样本量少于叶子内节点的最小数目,就返回数据集的平均值结束程序\n",
" if :\n",
" return None,\n",
" \n",
" # 遍历所有特征,\n",
" for feature in features:\n",
" ## TODO:将数据集按特征feature从小到大排序\n",
" dataSet = \n",
" # 遍历数据集中的数据,控制叶子节点数目\n",
" for index in range(shape(dataSet)[0]-1):\n",
" ## TODO: 排除dataSet数据集中的重复值\n",
" if :\n",
" continue\n",
" #将数据集按index分为前后两部分\n",
" data0 = dataSet[0:index+1, :]\n",
" data1 = dataSet[index+1:, :]\n",
" #判断样本数量是否足够,如果样本量少于最小分割,或者样本量少于叶子内节点的最小数目,就跳到下一个循环\n",
" if shape(data0)[0] < self.min_samples_leaf or shape(data1)[0] < self.min_samples_leaf:\n",
" continue;\n",
" #将两个数据集分别求取方差并加和作为新的分数\n",
" newS=self.get_varience(data0)+self.get_varience(data1)\n",
" #如果最好分数大于新的分数,将新的分数赋值给最好分数,保证方差越小越好\n",
" if bestS>newS:\n",
" bestfeature=feature\n",
" bestValue=dataSet[index][feature]\n",
"# print(bestfeature, bestValue)\n",
" bestS=newS\n",
" #如果误差不大就退出,说明无法分割\n",
" if (S-bestS)<self.min_change: \n",
" return None,self.get_mean(dataSet)\n",
"# print(bestfeature, bestValue)\n",
" return bestfeature,bestValue\n",
" \n",
" # 搭建单颗决策树\n",
" def createTree(self, dataSet, max_level, flag = 0):\n",
" if flag == 0:\n",
" seqtree = self.cur_tree+1\n",
" self.cur_tree = seqtree;\n",
" print('正在搭建第',seqtree,'棵树...')\n",
" #选择最适合的特征和值来构建树\n",
" bestfeature,bestValue=self.select_best_feature(dataSet)\n",
" if bestfeature==None:\n",
" if flag == 0:\n",
" print('第',seqtree,'棵树搭建完成!')\n",
" return bestValue\n",
" retTree={}\n",
" max_level-=1\n",
" if max_level<0: #控制深度\n",
" return self.get_mean(dataSet)\n",
" retTree['bestFeature']=bestfeature\n",
" retTree['bestVal']=bestValue\n",
" ## TODO: 使用self.SplitDataSet将数据集按bestfeature和bestValue分割成左右两棵树数据集\n",
" lSet,rSet=\n",
" # 使用self.createTree将左右两棵树数据集构建成树\n",
" retTree['right']=self.createTree(rSet,self.max_depth,1)\n",
" retTree['left']=self.createTree(lSet,self.max_depth,1)\n",
" if flag == 0:\n",
" print('第',seqtree,'棵树搭建完成!')\n",
" return retTree\n",
" \n",
" \n",
" # 初始化随机森林\n",
" def __init__(self, random_state, n_estimators, max_features, max_depth, min_change = 0.001,\n",
" min_samples_split = 0, min_samples_leaf = 0, sample_radio = 0.9, n_jobs = 10):\n",
" self.trees = []\n",
" self.random_state = random_state\n",
" np.random.seed(self.random_state)\n",
" self.n_estimators = n_estimators\n",
" self.max_features = max_features\n",
" self.max_depth = max_depth\n",
" self.min_change = min_change\n",
" self.min_samples_leaf = min_samples_leaf\n",
" self.min_samples_split = min_samples_split\n",
" self.sample_radio = sample_radio\n",
" self.n_jobs = n_jobs\n",
" \n",
" # 向森林添加单棵决策树\n",
" def get_one_tree(self, dataSet):\n",
" X_train, X_test, y_train, y_test = train_test_split(dataSet[:,:-1], dataSet[:,-1], \n",
" train_size = self.sample_radio, random_state = self.random_state)\n",
" X_train=np.concatenate((X_train,y_train.reshape((-1,1))),axis=1)\n",
" self.trees.append(self.createTree(X_train,self.max_depth))\n",
" \n",
" # 并行化搭建随机森林\n",
" def fit(self, X, Y): #树的个数,预测时使用的特征的数目,树的深度\n",
" dataSet = np.concatenate((X, Y.reshape(-1,1)), axis = -1)\n",
" Parallel(n_jobs=self.n_jobs, backend=\"threading\")(delayed(self.get_one_tree)(dataSet) for _ in range(self.n_estimators)) \n",
" \n",
" #预测单个数据样本\n",
" def treeForecast(self,tree,data):\n",
" if not isinstance(tree,dict):\n",
" return float(tree)\n",
" if data[tree['bestFeature']]>tree['bestVal']:\n",
" if type(tree['left'])=='float':\n",
" return tree['left']\n",
" else:\n",
" return self.treeForecast(tree['left'],data)\n",
" else:\n",
" if type(tree['right'])=='float':\n",
" return tree['right']\n",
" else:\n",
" return self.treeForecast(tree['right'],data) \n",
" \n",
" # 单决策树预测结果\n",
" def createForeCast(self,tree,dataSet):\n",
" seqtree = self.cur_tree+1\n",
" self.cur_tree = seqtree;\n",
" print('第'+str(seqtree)+'棵树正在预测...\\n')\n",
" l=len(dataSet)\n",
" predict=np.mat(zeros((l,1)))\n",
" for i in range(l):\n",
" predict[i,0]=self.treeForecast(tree,dataSet[i,:])\n",
" print('第'+str(seqtree)+'棵树预测完成!')\n",
" return predict\n",
" \n",
" ## TODO: 使用self.createForestCast函数更新预测值函数\n",
" def unpdate_predict(self, predict, tree, X):\n",
" predict+=\n",
" \n",
" # 随机森林预测结果\n",
" def predict(self,X):\n",
" self.cur_tree = 0;\n",
" l=len(X)\n",
" predict=np.mat(zeros((l,1)))\n",
" Parallel(n_jobs=self.n_jobs, backend=\"threading\")(delayed(self.unpdate_predict)(predict, tree, X) for tree in self.trees)\n",
" # 对多棵树预测的结果取平均\n",
" predict/=self.n_estimators\n",
" return predict\n",
" \n",
" # 获取模型分数\n",
" def get_score(self,target, X):\n",
" return r2_score(target, self.predict(X))"
]
},
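{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 以下给出上面各处 TODO 的一种可能实现思路(示意代码,仅供对照,并非唯一答案)。\n",
"# 为了方便单独运行验证,这里写成不依赖 self、名字带 _demo 后缀的独立函数,不会与原函数名冲突。\n",
"import numpy as np\n",
"\n",
"def get_mean_demo(dataSet):\n",
"    # 「计算y的均值」:取最后一列(目标值)的平均数\n",
"    return np.mean(dataSet[:, -1])\n",
"\n",
"def split_dataset_demo(dataSet, feature, value):\n",
"    # 「按特征值划分」:特征值大于 value 的样本作为左子集,其余作为右子集,\n",
"    # 与 treeForecast 中 data[bestFeature] > bestVal 走 'left' 分支的约定保持一致\n",
"    dataSet = dataSet[dataSet[:, feature].argsort()]\n",
"    left = dataSet[dataSet[:, feature] > value]\n",
"    right = dataSet[dataSet[:, feature] <= value]\n",
"    return left, right\n",
"\n",
"# select_best_feature 中的几处空缺,可按如下思路填写:\n",
"#   样本量不足时直接返回均值:\n",
"#       if shape(dataSet)[0] < self.min_samples_split or shape(dataSet)[0] < self.min_samples_leaf:\n",
"#           return None, self.get_mean(dataSet)\n",
"#   将数据集按特征 feature 从小到大排序:\n",
"#       dataSet = dataSet[dataSet[:, feature].argsort()]\n",
"#   排除相邻的重复特征值(重复值不能作为有效切分点):\n",
"#       if dataSet[index][feature] == dataSet[index + 1][feature]:\n",
"#           continue\n",
"# createTree 中的空缺:\n",
"#       lSet, rSet = self.SplitDataSet(dataSet, bestfeature, bestValue)\n",
"# unpdate_predict 中的空缺(累加每棵树的预测,predict 函数最后再取平均):\n",
"#       predict += self.createForeCast(tree, X)\n",
"\n",
"# 简单自测:\n",
"demo = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])\n",
"print(get_mean_demo(demo))                # 20.0\n",
"print(split_dataset_demo(demo, 0, 1.5))   # 左子集为第0列 > 1.5 的两行,右子集为一行"
]
},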
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.实例化随机森林并对boston数据进行训练、预测"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"正在搭建第1棵树...\n",
"\n",
"正在搭建第2棵树...\n",
"正在搭建第3棵树...\n",
"\n",
"\n",
"正在搭建第4棵树...\n",
"\n",
"第2棵树搭建完成!\n",
"正在搭建第5棵树...\n",
"\n",
"第1棵树搭建完成!\n",
"第3棵树搭建完成!正在搭建第6棵树...\n",
"\n",
"\n",
"正在搭建第7棵树...\n",
"\n",
"第4棵树搭建完成!\n",
"正在搭建第8棵树...\n",
"\n",
"第5棵树搭建完成!\n",
"第7棵树搭建完成!\n",
"正在搭建第9棵树...\n",
"\n",
"第6棵树搭建完成!正在搭建第10棵树...\n",
"\n",
"\n",
"第8棵树搭建完成!\n",
"第10棵树搭建完成!\n",
"第9棵树搭建完成!\n"
]
}
],
"source": [
"# rf2 = mycache(random_state=2, n_estimators=10, max_features=3, max_depth=10, min_change=0.001, min_samples_split=20, n_jobs=10)\n",
"rf1 = myrf(random_state=2, n_estimators=10, max_features=3, max_depth=10, min_change=0.001, min_samples_split=20, n_jobs=-1)\n",
"## TODO: 使用rf1的fit函数对训练特征数据boston.data和训练目标变量boston.target进行训练\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"第1棵树正在预测...\n",
"\n",
"第2棵树正在预测...\n",
"第1棵树预测完成!\n",
"第3棵树正在预测...\n",
"\n",
"第4棵树正在预测...\n",
"\n",
"\n",
"第5棵树正在预测...\n",
"\n",
"第5棵树预测完成!\n",
"第6棵树正在预测...\n",
"\n",
"第3棵树预测完成!\n",
"第7棵树正在预测...\n",
"\n",
"第7棵树预测完成!\n",
"第8棵树正在预测...\n",
"\n",
"第2棵树预测完成!\n",
"第8棵树预测完成!\n",
"第4棵树预测完成!\n",
"第9棵树正在预测...\n",
"\n",
"第9棵树预测完成!\n",
"第10棵树正在预测...\n",
"\n",
"第10棵树预测完成!\n",
"第6棵树预测完成!\n"
]
},
{
"data": {
"text/plain": [
"0.9396619095484052"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"## TODO:使用rf1的get_score函数对boston.target和boston.data进行预测并评价结果\n",
"\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.4"
},
"toc": {
"base_numbering": 1,
"nav_menu": {},
"number_sections": false,
"sideBar": true,
"skip_h1_title": false,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": false,
"toc_position": {},
"toc_section_display": true,
"toc_window_display": false
},
"varInspector": {
"cols": {
"lenName": 16,
"lenType": 16,
"lenVar": 40
},
"kernels_config": {
"python": {
"delete_cmd_postfix": "",
"delete_cmd_prefix": "del ",
"library": "var_list.py",
"varRefreshCmd": "print(var_dic_list())"
},
"r": {
"delete_cmd_postfix": ") ",
"delete_cmd_prefix": "rm(",
"library": "var_list.r",
"varRefreshCmd": "cat(var_dic_list()) "
}
},
"types_to_exclude": [
"module",
"function",
"builtin_function_or_method",
"instance",
"_Feature"
],
"window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}
CustomerID,Gender,Age,Annual Income (k$),Spending Score (1-100)
1,Male,19,15,39
2,Male,21,15,81
3,Female,20,16,6
4,Female,23,16,77
5,Female,31,17,40
6,Female,22,17,76
7,Female,35,18,6
8,Female,23,18,94
9,Male,64,19,3
10,Female,30,19,72
11,Male,67,19,14
12,Female,35,19,99
13,Female,58,20,15
14,Female,24,20,77
15,Male,37,20,13
16,Male,22,20,79
17,Female,35,21,35
18,Male,20,21,66
19,Male,52,23,29
20,Female,35,23,98
21,Male,35,24,35
22,Male,25,24,73
23,Female,46,25,5
24,Male,31,25,73
25,Female,54,28,14
26,Male,29,28,82
27,Female,45,28,32
28,Male,35,28,61
29,Female,40,29,31
30,Female,23,29,87
31,Male,60,30,4
32,Female,21,30,73
33,Male,53,33,4
34,Male,18,33,92
35,Female,49,33,14
36,Female,21,33,81
37,Female,42,34,17
38,Female,30,34,73
39,Female,36,37,26
40,Female,20,37,75
41,Female,65,38,35
42,Male,24,38,92
43,Male,48,39,36
44,Female,31,39,61
45,Female,49,39,28
46,Female,24,39,65
47,Female,50,40,55
48,Female,27,40,47
49,Female,29,40,42
50,Female,31,40,42
51,Female,49,42,52
52,Male,33,42,60
53,Female,31,43,54
54,Male,59,43,60
55,Female,50,43,45
56,Male,47,43,41
57,Female,51,44,50
58,Male,69,44,46
59,Female,27,46,51
60,Male,53,46,46
61,Male,70,46,56
62,Male,19,46,55
63,Female,67,47,52
64,Female,54,47,59
65,Male,63,48,51
66,Male,18,48,59
67,Female,43,48,50
68,Female,68,48,48
69,Male,19,48,59
70,Female,32,48,47
71,Male,70,49,55
72,Female,47,49,42
73,Female,60,50,49
74,Female,60,50,56
75,Male,59,54,47
76,Male,26,54,54
77,Female,45,54,53
78,Male,40,54,48
79,Female,23,54,52
80,Female,49,54,42
81,Male,57,54,51
82,Male,38,54,55
83,Male,67,54,41
84,Female,46,54,44
85,Female,21,54,57
86,Male,48,54,46
87,Female,55,57,58
88,Female,22,57,55
89,Female,34,58,60
90,Female,50,58,46
91,Female,68,59,55
92,Male,18,59,41
93,Male,48,60,49
94,Female,40,60,40
95,Female,32,60,42
96,Male,24,60,52
97,Female,47,60,47
98,Female,27,60,50
99,Male,48,61,42
100,Male,20,61,49
101,Female,23,62,41
102,Female,49,62,48
103,Male,67,62,59
104,Male,26,62,55
105,Male,49,62,56
106,Female,21,62,42
107,Female,66,63,50
108,Male,54,63,46
109,Male,68,63,43
110,Male,66,63,48
111,Male,65,63,52
112,Female,19,63,54
113,Female,38,64,42
114,Male,19,64,46
115,Female,18,65,48
116,Female,19,65,50
117,Female,63,65,43
118,Female,49,65,59
119,Female,51,67,43
120,Female,50,67,57
121,Male,27,67,56
122,Female,38,67,40
123,Female,40,69,58
124,Male,39,69,91
125,Female,23,70,29
126,Female,31,70,77
127,Male,43,71,35
128,Male,40,71,95
129,Male,59,71,11
130,Male,38,71,75
131,Male,47,71,9
132,Male,39,71,75
133,Female,25,72,34
134,Female,31,72,71
135,Male,20,73,5
136,Female,29,73,88
137,Female,44,73,7
138,Male,32,73,73
139,Male,19,74,10
140,Female,35,74,72
141,Female,57,75,5
142,Male,32,75,93
143,Female,28,76,40
144,Female,32,76,87
145,Male,25,77,12
146,Male,28,77,97
147,Male,48,77,36
148,Female,32,77,74
149,Female,34,78,22
150,Male,34,78,90
151,Male,43,78,17
152,Male,39,78,88
153,Female,44,78,20
154,Female,38,78,76
155,Female,47,78,16
156,Female,27,78,89
157,Male,37,78,1
158,Female,30,78,78
159,Male,34,78,1
160,Female,30,78,73
161,Female,56,79,35
162,Female,29,79,83
163,Male,19,81,5
164,Female,31,81,93
165,Male,50,85,26
166,Female,36,85,75
167,Male,42,86,20
168,Female,33,86,95
169,Female,36,87,27
170,Male,32,87,63
171,Male,40,87,13
172,Male,28,87,75
173,Male,36,87,10
174,Male,36,87,92
175,Female,52,88,13
176,Female,30,88,86
177,Male,58,88,15
178,Male,27,88,69
179,Male,59,93,14
180,Male,35,93,90
181,Female,37,97,32
182,Female,32,97,86
183,Male,46,98,15
184,Female,29,98,88
185,Female,41,99,39
186,Male,30,99,97
187,Female,54,101,24
188,Male,28,101,68
189,Female,41,103,17
190,Female,36,103,85
191,Female,34,103,23
192,Female,32,103,69
193,Male,33,113,8
194,Female,38,113,91
195,Female,47,120,16
196,Female,35,120,79
197,Female,45,126,28
198,Male,32,126,74
199,Male,32,137,18
200,Male,30,137,83
# Biplot
import matplotlib.pyplot as plt
import numpy as np
def biplot(score, coeff, labels=None):
xs = score[:,0]
ys = score[:,1]
n = coeff.shape[0]
scalex = 1.0/(xs.max()- xs.min())
scaley = 1.0/(ys.max()- ys.min())
plt.scatter(xs*scalex,ys*scaley, color="#c7e9c0", edgecolor="#006d2c", alpha=0.5)
for i in range(n):
plt.arrow(0, 0, coeff[i,0], coeff[i,1],color='#253494',alpha=0.5,lw=2)
if labels is None:
plt.text(coeff[i,0]* 1.15, coeff[i,1] * 1.15, "Var"+str(i+1), color="#000000", ha="center", va="center")
else:
plt.text(coeff[i,0]* 1.15, coeff[i,1] * 1.15, labels[i], color="#000000", ha="center", va="center")
plt.xlim(-.75,1)
plt.ylim(-0.5,1)
plt.grid(False)
plt.xticks(np.arange(0, 1, 0.5), size=12)
plt.yticks(np.arange(-0.75, 1, 0.5), size=12)
plt.xlabel("Component 1", size=14)
plt.ylabel("Component 2", size=14)
plt.gca().spines["top"].set_visible(False);
plt.gca().spines["right"].set_visible(False);
\ No newline at end of file
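# 调用示例(示意:这里用随机数据演示;实际使用时把 X 换成自己的数值特征矩阵、labels 换成各列名称即可)
if __name__ == '__main__':
    from sklearn.decomposition import PCA
    rng = np.random.RandomState(0)
    X = rng.normal(size=(100, 4))                       # 100 个样本、4 个特征的示例数据
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)                       # 样本在前两个主成分上的坐标
    biplot(scores, pca.components_.T,                   # components_.T 的形状为 (特征数, 2)
           labels=['f1', 'f2', 'f3', 'f4'])
    plt.show()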