- Title:
- Interpretable decision-tree induction in a big data parallel framework
- Authors:
-
Weinberg, A. I.
Last, M.
- Related links:
- https://bibliotekanauki.pl/articles/330635.pdf
- Publication date:
- 2017
- Publisher:
- Uniwersytet Zielonogórski. Oficyna Wydawnicza
- Topics:
-
big data
parallel computing
mapreduce
decision trees
editing distance
tree similarity
data set
- Description:
- When running data-mining algorithms on big data platforms, a parallel, distributed framework, such as MAPREDUCE, may be used. However, in a parallel framework, each individual model fits the data allocated to its own computing node without necessarily fitting the entire dataset. In order to induce a single consistent model, ensemble algorithms, such as majority voting, aggregate the local models rather than analyzing the entire dataset directly. Our goal is to develop an efficient algorithm for choosing one representative model from multiple, locally induced decision-tree models. The proposed SySM (syntactic similarity method) algorithm computes the similarity between the models produced by parallel nodes and chooses the model which is most similar to the others as the best representative of the entire dataset. In 18.75% of 48 experiments on four big datasets, SySM accuracy is significantly higher than that of the ensemble; in 43.75% of the experiments, SySM accuracy is significantly lower; in one case, the results are identical; and in the remaining 35.41% of cases, the difference is not statistically significant. Compared with ensemble methods, the representative tree models selected by the proposed methodology are more compact and interpretable, their induction consumes less memory, and, as confirmed by the empirical results, they allow faster classification of new records.
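The selection idea described above can be illustrated with a minimal sketch, which is not the authors' implementation: each locally induced tree is reduced here to a pre-order sequence of node labels, and a standard Levenshtein edit distance over these sequences stands in for the paper's syntactic tree-similarity measure; the tree with the smallest total distance to the other trees is returned as the representative. The function names and the toy trees are hypothetical.

from typing import List, Sequence

def levenshtein(a: Sequence[str], b: Sequence[str]) -> int:
    # Classic dynamic-programming edit distance between two label sequences.
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, start=1):
        curr = [i] + [0] * len(b)
        for j, y in enumerate(b, start=1):
            cost = 0 if x == y else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[-1]

def most_representative(trees: List[Sequence[str]]) -> int:
    # Index of the tree with the smallest total distance to all other trees.
    totals = [sum(levenshtein(t, other) for other in trees) for t in trees]
    return totals.index(min(totals))

if __name__ == "__main__":
    # Toy example: three locally induced trees as pre-order label sequences
    # (hypothetical split conditions, not data from the paper).
    trees = [
        ["age<30", "income<50k", "yes", "no", "no"],
        ["age<30", "income<60k", "yes", "no", "no"],
        ["city=NY", "age<40", "no", "yes", "yes"],
    ]
    print("representative tree index:", most_representative(trees))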
- Source:
-
International Journal of Applied Mathematics and Computer Science; 2017, 27, 4; 737-748
1641-876X
2083-8492
- Appears in:
- International Journal of Applied Mathematics and Computer Science
- Content provider:
- Biblioteka Nauki