
You are searching for the phrase "data quality" by criterion: Topic


Title:
Geometric and semantic quality assessments of building features in OpenStreetMap for some areas of Istanbul
Authors:
Basaraner, Melih
Links:
https://bibliotekanauki.pl/articles/1191768.pdf
Publication date:
2020
Publisher:
Oddział Kartograficzny Polskiego Towarzystwa Geograficznego
Topics:
OpenStreetMap
building features
geometric data quality
semantic data quality
topographic data
Description:
Nowadays, volunteered geographic information (VGI) and collaborative mapping projects such as OpenStreetMap (OSM) have gained popularity, as they not only offer free data but also allow crowdsourced contributions. Spatial data entry in this manner creates quality concerns for further use of VGI data. In this regard, this article focuses on the assessment of the geometric and semantic quality of OSM building features (BFs) against large-scale topographic (TOPO) data for some areas of Istanbul. The comparison is carried out on one-to-one matched BFs, selected according to a geometric matching ratio. In geometric terms, various parameters of position (i.e. X, Y), size (i.e. area, perimeter and granularity), shape (i.e. convexity, circularity, elongation, equivalent rectangular index, rectangularity and roughness index) and orientation (i.e. orientation angle) are computed and compared. In semantic terms, BF type coherence is evaluated. According to the findings on geometric quality, the average positional difference was less than three meters. In addition, perimeter values tended to decrease while area and granularity values tended to increase in the OSM data relative to the TOPO data, showing that the level of detail of the OSM BFs was generally lower than that of the TOPO BFs. This was also confirmed by the decreasing tendency of shape complexity according to the shape parameters. Orientation angle differences were usually small, except for some special cases. It was found that the scale of the OSM dataset, though not homogeneous, approximately corresponded to the lower limit of medium-scale maps (i.e. 1:10,000) or a slightly smaller scale. According to the findings on semantic quality, when a specific type was defined, coherence between OSM and TOPO BFs was rather high, although most OSM BFs did not have a specific type attribute.
This study showed that the matching process needed some improvements, while the followed approach was largely successful in evaluating the matched buildings from geometric and semantic aspects.
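The shape parameters named in the abstract can be computed from polygon geometry alone. As an illustrative sketch (these are common textbook formulas, not necessarily the paper's exact definitions; e.g. rectangularity may instead use a minimum rotated rectangle), two of them in Python:

```python
import math

def polygon_area_perimeter(pts):
    """Shoelace area and perimeter of a closed polygon given as (x, y) tuples."""
    area = 0.0
    perim = 0.0
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perim

def circularity(pts):
    """4*pi*A / P^2 -- equals 1 for a circle, smaller for complex shapes."""
    a, p = polygon_area_perimeter(pts)
    return 4.0 * math.pi * a / p ** 2

def rectangularity(pts):
    """Ratio of polygon area to its axis-aligned bounding-box area
    (a simplification; a rotated minimum rectangle is often used instead)."""
    a, _ = polygon_area_perimeter(pts)
    xs = [x for x, _ in pts]
    ys = [y for _, y in pts]
    return a / ((max(xs) - min(xs)) * (max(ys) - min(ys)))
```

For a unit square, circularity is 4π/16 = π/4 ≈ 0.785 and rectangularity is exactly 1.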
Source:
Polish Cartographical Review; 2020, 52, 3; 94-107
2450-6974
Appears in:
Polish Cartographical Review
Content provider:
Biblioteka Nauki
Article
Title:
An algorithm for data quality assessment in predictive toxicology
Authors:
Malazizi, L.
Neagu, D.
Chaudhry, Q.
Links:
https://bibliotekanauki.pl/articles/1943267.pdf
Publication date:
2007
Publisher:
Politechnika Gdańska
Topics:
QSAR models
data quality
data cleaning
Description:
The lack of quality in information integrated from heterogeneous sources is an important issue in many scientific domains. In toxicology the stakes are even higher, since the data are used for Quantitative Structure-Activity Relationship (QSAR) modelling to predict the chemical toxicity of new compounds. Much work has been done on QSARs, but little attention has been paid to the quality of the data used, which points to the absence of a quality criteria framework in this domain. This paper reviews some existing data quality assessment methods from various domains and their relevance and possible application to predictive toxicology, highlights a number of data quality deficiencies found in experimental work on internal data, and proposes quality metrics and an algorithm for assessing data quality based on the results.
Source:
TASK Quarterly. Scientific Bulletin of Academic Computer Centre in Gdansk; 2007, 11, 1-2; 103-115
1428-6394
Appears in:
TASK Quarterly. Scientific Bulletin of Academic Computer Centre in Gdansk
Content provider:
Biblioteka Nauki
Article
Title:
The method for the quality evaluation of open geospatial data for creation and updating of datasets for National Spatial Data Infrastructure in Ukraine
Authors:
Kin, Danylo
Lazorenko-Hevel, Nadiia
Links:
https://bibliotekanauki.pl/articles/1505977.pdf
Publication date:
2021
Publisher:
Oddział Kartograficzny Polskiego Towarzystwa Geograficznego
Topics:
open data
data quality
GKI
NSDI
RDF
Description:
The purpose of the article is to present research on a method for evaluating the quality of published open geospatial data and its implementation in Ukraine. The method follows the international standard ISO 19157 "Geographic information. Data quality" and assigns each resource a number of points, or a level, with a maximum of 5. The research was carried out to evaluate open geoinformation resources for the production of geospatial datasets, as defined in the Ukrainian Law on NSDI. The authors evaluated the quality of 142 open geoinformation resources and other information resources (materials) for the production and updating of 34 geospatial datasets for the development of NSDI in Ukraine. The authors present examples of the quality evaluation of geospatial data for the datasets "State Geodetic Reference Coordinate System UCS-2000", "State Geodetic Network", "Geographical Names" and "Administrative Units", because they are components of the Core Reference Dataset of NSDI. Limitations of the research were determined by the adopted Law of Ukraine "On National Spatial Data Infrastructure", the Order for NSDI functioning in Ukraine and the requirements of the international standard ISO 19157 "Geographic information. Data quality". The results of the research will be employed to evaluate the quality of NSDI implementation in Ukraine. The proposed method allows the quality of open geospatial datasets to be evaluated before they are used for the analysis and modelling of terrain and phenomena, and takes into account the quality of geospatial data together with the related requirements for their production, updating and publication.
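A level-based scoring of this kind can be sketched as follows. Only the 0-5 scale comes from the abstract; the five criterion names below are illustrative assumptions loosely based on ISO 19157 quality elements, not the article's actual checklist:

```python
# Illustrative 0-5 quality level for an open geospatial resource.
# Criterion names are assumptions; only the five-level scale is from the abstract.
CRITERIA = ("completeness", "logical_consistency", "positional_accuracy",
            "thematic_accuracy", "temporal_quality")

def quality_level(resource):
    """One point per satisfied quality criterion, 5 points at most.

    `resource` is a dict mapping criterion name to True/False."""
    return sum(1 for c in CRITERIA if resource.get(c, False))
```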
Source:
Polish Cartographical Review; 2021, 53; 13-20
2450-6974
Appears in:
Polish Cartographical Review
Content provider:
Biblioteka Nauki
Article
Title:
Knowledge mining from data: methodological problems and directions for development
Authors:
Kulikowski, J.
Links:
https://bibliotekanauki.pl/articles/1934003.pdf
Publication date:
2011
Publisher:
Politechnika Gdańska
Topics:
data mining
knowledge discovery
data quality
CODATA
Description:
The development of knowledge engineering and, within its framework, of data mining or knowledge mining from data should result in the characteristics or descriptions of objects, events, processes and/or rules governing them, which should satisfy certain quality criteria: credibility, accuracy, verifiability, topicality, mutual logical consistency, usefulness, etc. Choosing suitable mathematical models of knowledge mining from data ensures satisfying only some of the above criteria. This paper presents, also in the context of the aims of The Committee on Data for Science and Technology (CODATA), more general aspects of knowledge mining and popularization, which require applying the rules that enable or facilitate controlling the quality of data.
Source:
TASK Quarterly. Scientific Bulletin of Academic Computer Centre in Gdansk; 2011, 15, 2; 227-233
1428-6394
Appears in:
TASK Quarterly. Scientific Bulletin of Academic Computer Centre in Gdansk
Content provider:
Biblioteka Nauki
Article
Title:
Processing of 3D Weather Radar Data with Application for Assimilation in the NWP Model
Authors:
Ośródka, Katarzyna
Szturc, Jan
Jakubiak, Bogumił
Jurczyk, Anna
Links:
https://bibliotekanauki.pl/articles/2037641.pdf
Publication date:
2014-10-10
Publisher:
Uniwersytet Warszawski. Wydział Geografii i Studiów Regionalnych
Topics:
Weather radar
radar reflectivity
data quality
data assimilation
Description:
The paper is focused on the processing of 3D weather radar data to minimize the impact of a number of errors from different sources, both meteorological and non-meteorological. The data is also quantitatively characterized in terms of its quality. A set of dedicated algorithms based on analysis of the reflectivity field pattern is described. All the developed algorithms were tested on data from the Polish radar network POLRAD. Quality control plays a key role in avoiding the introduction of incorrect information into applications using radar data. One of the quality control methods is radar data assimilation in numerical weather prediction models to estimate initial conditions of the atmosphere. The study shows an experiment with quality controlled radar data assimilation in the COAMPS model using the ensemble Kalman filter technique. The analysis proved the potential of radar data for such applications; however, further investigations will be indispensable.
Source:
Miscellanea Geographica. Regional Studies on Development; 2014, 18, 3; 31-39
0867-6046
2084-6118
Appears in:
Miscellanea Geographica. Regional Studies on Development
Content provider:
Biblioteka Nauki
Article
Title:
Methodology of assessing quality of spatial data describing course of shoreline as tool supporting water resource management process
Authors:
Kwartnik-Pruc, Anita
Mączyńska, Aneta
Links:
https://bibliotekanauki.pl/articles/27312658.pdf
Publication date:
2023
Publisher:
Instytut Technologiczno-Przyrodniczy
Topics:
data quality
data validity
shoreline
water resource management
Description:
The proper management of water resources is currently an important issue, not only in Poland but worldwide. Water resource management involves various activities, including monitoring, modelling, assessment and design of the condition and extent of water resources. The efficient management of water resources is essential, especially in rural areas, where it ensures greater stability and efficiency of production in all sectors of the economy and supports the well-being of the ecosystem. This research paper presents an original method of assessing the quality of spatial data used to determine the course of the shoreline of natural watercourses with unregulated channels flowing through agricultural land. The performed analyses demonstrated that the time of origin of the cadastral data defining the course of water boundaries has a significant effect on their quality. After analysing the factors (timeliness, completeness, redundancy) used to assess the quality of cadastral data, a clear trend of change over time was noticed. Thus, it is possible to estimate the quality of cadastral data defining the course of watercourse boundaries based solely on information about the method, time and area of data origin in the context of the former partition sector. The research also demonstrated that, to increase the efficiency of work, the smallest number of principal factors should be selected for the final analysis; limiting the analyses to a smaller number of factors does not affect the final result, yet it definitely reduces the amount of work.
Source:
Journal of Water and Land Development; 2023, 57; 167-180
1429-7426
2083-4535
Appears in:
Journal of Water and Land Development
Content provider:
Biblioteka Nauki
Article
Title:
Neonatal Mortality by Gestational Age and Birth Weight in Italy, 1998-2003: A Record Linkage Study
Authors:
Marini, Cristiano
Nuccitelli, Alessandra
Links:
https://bibliotekanauki.pl/articles/465989.pdf
Publication date:
2011
Publisher:
Główny Urząd Statystyczny
Topics:
administrative data
Boolean linear programming
data quality
greedy algorithm
Description:
Neonatal mortality rates by gestational age and birth weight category are important indicators of maternal and child health and care quality. However, due to recent laws on administrative simplification and privacy, these specific rates have not been calculated in Italy since 1999. The main aim of this work is to assess the possibility of retrieving information on neonatal mortality by the linkage between records related to live births and records related to infant deaths within the first month of life, with reference to 2003 and 2004 birth cohorts. From a strict methodological point of view, some critical aspects of the most used record linkage approach are highlighted: specific problems may arise from the choice of records to be linked if there are consistency constraints between pairs (in this context, one death record can be linked to at most one birth record). In the light of considerations on the quality of the starting data, the retrieval of information on neonatal mortality by gestational age and birth weight is restricted to Northern Italy. Specific neonatal mortality rates are provided with reference to 2003 and discussed with particular emphasis on quality issues in the data collection processes.
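The consistency constraint described in the abstract (one death record linked to at most one birth record), together with the "greedy algorithm" keyword, suggests a matching scheme along these lines. This is a generic sketch: the similarity scores on candidate pairs are assumed, not taken from the paper.

```python
def greedy_one_to_one(pairs):
    """Greedy one-to-one record linkage.

    `pairs` is an iterable of (score, death_id, birth_id) candidate links.
    Pairs are accepted in decreasing score order, enforcing that each
    death record and each birth record is linked at most once."""
    matched, used_d, used_b = [], set(), set()
    for score, d, b in sorted(pairs, reverse=True):
        if d not in used_d and b not in used_b:
            matched.append((d, b, score))
            used_d.add(d)
            used_b.add(b)
    return matched
```

A greedy pass is fast but not globally optimal; the Boolean linear programming formulation mentioned in the keywords would instead maximise the total score subject to the same one-to-one constraints.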
Source:
Statistics in Transition new series; 2011, 12, 1; 207-222
1234-7655
Appears in:
Statistics in Transition new series
Content provider:
Biblioteka Nauki
Article
Title:
ICA and ICS-based rankings of EU countries according to quality of mirror data on intra-Community trade in goods in the years 2014-2017
Authors:
Markowicz, Iwona
Baran, Paweł
Links:
https://bibliotekanauki.pl/articles/19090893.pdf
Publication date:
2019
Publisher:
Instytut Badań Gospodarczych
Topics:
official statistics data quality
mirror data
intra-Community trade
EU
Description:
Research background: As a system of official EU statistics, Intrastat contains data collected by Member States and aggregated by Eurostat at the Union level in the COMEXT database. Country-level data are based on declarations made by businesses dispatching goods to or acquiring goods from other EU Member States. Since the same transaction is declared twice - as an ICS in one country and at the same time as an ICA in the partner country - the database contains mirror data. Analysis of mirror data lets us assess the quality of official statistics on international trade. Purpose of the article: The aim of the article is to rank EU Member States according to the quality of data on intra-Community trade in goods collected by Intrastat. Foreign trade stimulates economic development on the one hand and reflects that development on the other, so it is very important that official statistics in this area be of good quality. Analysis of mirror data from partner states in intra-Community trade in goods shows that not every Member State provides data of a satisfactory quality level. Methods: We used the authors' methodology for assessing the quality of mirror data, including data asymmetry indices, both those proposed by Eurostat and the authors' own proposals. We have also examined changes in the rankings over time. Findings & Value added: The result of the survey is an ordering of EU Member States according to the quality of data on intra-Community trade in goods. The rankings are presented for the period 2014-2017, during which there were 28 Member States of the EU. Changes in individual countries' positions reflect changes in the overall quality of statistical data collected in those countries. The research methodology can be used in the process of monitoring the data quality of the Intrastat system.
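A data asymmetry index of the kind the abstract refers to compares the two mirror declarations of the same flow. The variant below is one simple, commonly used form (relative difference of the mirrored values); the article and Eurostat use several related indices, so treat this as a sketch rather than the authors' exact measure:

```python
def asymmetry_index(ics, ica):
    """Relative asymmetry between mirror declarations of the same trade flow.

    `ics` is the value declared as an intra-Community supply in one country,
    `ica` the mirrored intra-Community acquisition in the partner country.
    Returns 0 for perfectly matching mirror data and approaches +/-1 as the
    reporting becomes one-sided."""
    if ics + ica == 0:
        return 0.0
    return (ics - ica) / (ics + ica)
```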
Source:
Oeconomia Copernicana; 2019, 10, 1; 55-68
2083-1277
Appears in:
Oeconomia Copernicana
Content provider:
Biblioteka Nauki
Article
Title:
Application of ARIMA Models for the Analysis of Utilization Process of Military Technical Objects
Authors:
Borucka, Anna
Links:
https://bibliotekanauki.pl/articles/503736.pdf
Publication date:
2018
Publisher:
Międzynarodowa Wyższa Szkoła Logistyki i Transportu
Topics:
aircraft
forecasting
ARIMA models
military data analysis
data quality issue
Description:
The newest solutions in the Polish Armed Forces are implemented gradually and focus mainly on soldiers' combat readiness. Many concurrent processes occur whose proper analysis and interpretation could support the command process and task realization; however, sparse, paper-based records are an obstacle to modelling them. The author therefore attempts to describe the process of exploitation of military technical objects based on archival data, recorded according to the current rules for preparing, circulating and storing documents in the Polish Armed Forces. On this basis, a method for studying the readiness of aircraft from a military air base using ARIMA models is proposed. Using empirical data covering two years of exploitation, the time series under study was identified and several models were estimated; finally, the best model was selected and verified.
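ARIMA models are normally estimated with a statistical package. As a minimal, self-contained illustration of the autoregressive core of that model family (not the paper's actual estimation procedure), an AR(1) model, i.e. ARIMA(1,0,0), can be fitted by least squares:

```python
def fit_ar1(series):
    """Least-squares estimate of c and phi in x[t] = c + phi * x[t-1] + noise."""
    x_prev = series[:-1]
    x_next = series[1:]
    n = len(x_prev)
    mp = sum(x_prev) / n
    mn = sum(x_next) / n
    cov = sum((a - mp) * (b - mn) for a, b in zip(x_prev, x_next))
    var = sum((a - mp) ** 2 for a in x_prev)
    phi = cov / var
    c = mn - phi * mp
    return c, phi

def forecast(series, c, phi, steps=1):
    """Iterate the fitted recurrence forward from the last observation."""
    x = series[-1]
    out = []
    for _ in range(steps):
        x = c + phi * x
        out.append(x)
    return out
```

On the noise-free series 8, 4, 2, 1, 0.5 the fit recovers phi = 0.5 and c = 0 exactly, and the one-step forecast is 0.25.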
Source:
Logistics and Transport; 2018, 37, 1; 13-22
1734-2015
Appears in:
Logistics and Transport
Content provider:
Biblioteka Nauki
Article
Title:
Microarray Inspector: tissue cross contamination detection tool for microarray data
Authors:
Stępniak, Piotr
Maycock, Matthew
Wojdan, Konrad
Markowska, Monika
Perun, Serhiy
Srivastava, Aashish
Wyrwicz, Lucjan
Świrski, Konrad
Links:
https://bibliotekanauki.pl/articles/1039460.pdf
Publication date:
2013
Publisher:
Polskie Towarzystwo Biochemiczne
Topics:
microarray
transcription profiling
contamination analysis
adipose tissue
cancer
data quality
Description:
Microarray technology changed the landscape of contemporary life sciences by providing vast amounts of expression data. Researchers are building up repositories of experiment results for various conditions and samples, which serve the scientific community as a precious resource. Ensuring that the sample is of high quality is of utmost importance to this effort. The task is complicated by the fact that many datasets lack information concerning pre-experimental quality assessment. Transcription profiling of tissue samples may be invalidated by an error caused by heterogeneity of the material. The risk of tissue cross contamination is especially high in oncological studies, where it is often difficult to extract the sample. Therefore, there is a need to develop a method for detecting tissue contamination in the post-experimental phase. We propose Microarray Inspector: customizable, user-friendly software that enables easy detection of samples containing mixed tissue types. The advantage of the tool is that it uses raw expression data files and analyses each array independently. In addition, the system allows the user to adjust the criteria of the analysis to individual needs and research requirements. The final output of the program contains easy-to-read reports on the tissue contamination assessment, with detailed information about the test parameters and results. Microarray Inspector provides a list of contaminant biomarkers needed in the analysis of adipose tissue contamination. Using real data (datasets from public repositories) and our tool, we confirmed the high specificity of the software in detecting contamination. The results indicated the presence of adipose tissue admixture ranging from approximately 4% to 13% in several tested surgical samples.
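The biomarker-based idea can be illustrated with a toy per-array check: flag a sample when too many contaminant-tissue biomarkers are expressed above their expected background. This is a sketch only; the gene names, the threshold rule and the 50% cutoff are all illustrative assumptions, not Microarray Inspector's actual criteria.

```python
def contamination_score(expr, biomarkers, background):
    """Fraction of contaminant biomarkers expressed above background
    in a single array. `expr` and `background` map gene name -> level."""
    hits = sum(1 for g in biomarkers if expr.get(g, 0.0) > background.get(g, 0.0))
    return hits / len(biomarkers)

def is_contaminated(expr, biomarkers, background, cutoff=0.5):
    """Flag the array when at least `cutoff` of biomarkers exceed background."""
    return contamination_score(expr, biomarkers, background) >= cutoff
```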
Source:
Acta Biochimica Polonica; 2013, 60, 4; 647-655
0001-527X
Appears in:
Acta Biochimica Polonica
Content provider:
Biblioteka Nauki
Article
Title:
Experimental evaluation of the accuracy parameters of former surveying networks
Authors:
Ślusarski, M.
Justyniak, N.
Links:
https://bibliotekanauki.pl/articles/101101.pdf
Publication date:
2017
Publisher:
Polska Akademia Nauk. Stowarzyszenie Infrastruktura i Ekologia Terenów Wiejskich PAN
Topics:
modernization of land and building registry
cadastre
spatial data quality
Description:
Surveying field measurements performed during the development and updating of the land registry were based directly on measuring networks. The accuracy parameters of these networks are lower than those of modern measurement networks, mainly because of the lower precision of measurements, the two orders of the network and the approximate techniques used for determining the coordinates of network points. Currently, archival materials of the State Geodetic and Cartographic Resource are used during the surveying of real estate (division, separation, modernization of the land registry). The paper presents the results of an experimental evaluation of the accuracy parameters of former networks. The purpose of the evaluation was to analyze the possibility of using archival materials of the State Geodetic and Cartographic Resource during surveying works related to real estate. The study was carried out on three test objects located in the Małopolskie voivodship. Points of the analyzed networks found on the ground were measured (approx. 34% of all points), and then their coordinates were determined. The research sample consisted of sets of deviations Z - the length of the displacement vector of a point relative to its position considered as error-free. The basic estimator of the accuracy of the analyzed networks was the root mean square error (RMSE), determined on the basis of the vector set Z* from which outlying observations had been removed. The calculated RMSE values show that, for the analyzed objects, the required accuracy is met by no more than 16% of the network points (RMSE ≤ 0.20 m). Therefore, it may be concluded that for more than 80% of the border points the average position errors exceed the threshold of 0.30 m relative to the 1st class geodetic network.
Hence the conclusion that, during the surveying of real estate, archival materials of the State Geodetic and Cartographic Resource should be used only to a limited extent.
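The RMSE estimator over the trimmed set Z* can be sketched as follows. The abstract only says that outlying observations were removed; the 3-sigma-style rejection rule used here is an assumption for illustration.

```python
import math

def rmse_without_outliers(deviations, k=3.0):
    """RMSE of point displacement magnitudes Z after discarding outliers.

    A preliminary RMSE is computed over all deviations, then values larger
    than k times that preliminary RMSE are rejected (assumed rule) and the
    RMSE is recomputed over the retained set Z*."""
    n = len(deviations)
    prelim = math.sqrt(sum(z * z for z in deviations) / n)
    kept = [z for z in deviations if abs(z) <= k * prelim]
    return math.sqrt(sum(z * z for z in kept) / len(kept))
```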
Source:
Infrastruktura i Ekologia Terenów Wiejskich; 2017, II/2; 825-835
1732-5587
Appears in:
Infrastruktura i Ekologia Terenów Wiejskich
Content provider:
Biblioteka Nauki
Article
Title:
The role of big data in Industry 4.0 in mining industry in Serbia
Authors:
Tylečková, Eva
Noskievičová, Darja
Links:
https://bibliotekanauki.pl/articles/88570.pdf
Publication date:
2020
Publisher:
Stowarzyszenie Menedżerów Jakości i Produkcji
Topics:
big data
Industry 4.0
data quality
data security
Description:
The current age, characterized by unstoppable progress and the rapid development of new technologies and methods such as the Internet of Things, machine learning and artificial intelligence, brings new requirements for enterprise information systems. An information system ought to be a consistent set of elements that provide a basis for information that can be used in context to obtain knowledge. To generate valid knowledge, information must be based on objective and current data. Furthermore, due to Industry 4.0 trends such as digitalization and online process monitoring, the amount of data produced is constantly increasing - in this context the term Big Data is used. The aim of this article is to point out the role of Big Data within Industry 4.0, although Big Data can be used in a much wider range of business areas, not just in industry. The term Big Data encompasses issues related to the exponentially growing volume of produced data, their variety and the velocity of their origin. These characteristics of Big Data are also associated with possible processing problems. The article also focuses on the issue of ensuring and monitoring data quality. Reliable information cannot be inferred from poor-quality data, and the knowledge gained from such information is inaccurate; the expected results do not appear, and the ultimate consequence may be a loss of confidence in the information system used. On the contrary, it can be assumed that the acquisition, storage and use of Big Data will in the future become a key factor in maintaining competitiveness, business growth and further innovation. Thus, organizations that systematically use Big Data in their decision-making processes and planning strategies will have a competitive advantage.
Source:
System Safety : Human - Technical Facility - Environment; 2020, 2, 1; 166-173
2657-5450
Appears in:
System Safety : Human - Technical Facility - Environment
Content provider:
Biblioteka Nauki
Article
Title:
BDOT500 Database of Physical Topographic Objects – Basic Qualitative Analysis
Bazy danych obiektów topograficznych BDOT500 – podstawowa analiza jakościowa
Authors:
Ślusarski, M.
Links:
https://bibliotekanauki.pl/articles/100278.pdf
Publication date:
2015
Publisher:
Uniwersytet Rolniczy im. Hugona Kołłątaja w Krakowie
Topics:
topographical object database
metadata
spatial data quality
topography
BDOT500
Description:
Official databases that gather spatial data should include sets of metadata describing the information within. The fundamental element of a metadata set consists of features describing the quality and relative importance of geospatial data. In the present work, we propose a method for evaluating the quality of databases of topographic objects (BDOT500) based on four criteria: positional accuracy, completeness, validity (in the sense of being up to date) and logical consistency. The overall quality evaluation of the BDOT500 database was carried out by calculating two estimators: the average absolute value and the coefficient of average variation. The method described herein allows for a quick and credible evaluation of BDOT500 data quality at a basic level. Applying the method does not require any additional means, as all the necessary information is already recorded inside the database.
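The two estimators named in the abstract can be sketched directly. How the per-criterion values are aggregated in the paper is not stated, so the block below simply computes the mean absolute value and a coefficient of variation (population standard deviation over the mean) for a set of quality measurements; both formulas are standard, the aggregation is an assumption.

```python
import math

def mean_absolute(values):
    """Average absolute value of a set of quality measurements."""
    return sum(abs(v) for v in values) / len(values)

def coefficient_of_variation(values):
    """Population standard deviation divided by the mean (dimensionless)."""
    m = sum(values) / len(values)
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / len(values))
    return sd / m
```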
Source:
Geomatics, Landmanagement and Landscape; 2015, 1; 69-75
2300-1496
Appears in:
Geomatics, Landmanagement and Landscape
Content provider:
Biblioteka Nauki
Article
Title:
Big data – wyzwanie dla rachunkowości zarządczej
Big data as a challenge for management accounting
Authors:
Burnet-Wyrwa, Wioletta
Links:
https://bibliotekanauki.pl/articles/590501.pdf
Publication date:
2017
Publisher:
Uniwersytet Ekonomiczny w Katowicach
Topics:
Big Data
Challenges
Constraints
Data quality
Management accounting
Description:
The paper presents the challenges that accounting specialists and managers may face when integrating data from unstructured sources into management accounting systems, and the challenges and constraints associated with acquiring, processing, visualizing and sharing the results. It also shows the multi-faceted impact Big Data technologies have on the competencies required of management accounting professionals.
Source:
Studia Ekonomiczne; 2017, 341; 45-53
2083-8611
Appears in:
Studia Ekonomiczne
Content provider:
Biblioteka Nauki
Article
Title:
Ocena przydatności OpenStreetMap jako źródła danych dla analiz sieciowych
Assessment of OpenStreetMap suitability as a data source for network analysis
Authors:
Cichociński, P.
Links:
https://bibliotekanauki.pl/articles/346835.pdf
Publication date:
2012
Publisher:
Polskie Towarzystwo Informacji Przestrzennej
Topics:
data quality
standardization
spatial data
OpenStreetMap
network analysis
Description:
For several years now, GIS users have had at their disposal a data set constituting an alternative of sorts to commercial products: OpenStreetMap (OSM), a project whose objective is to build an editable map of the world available without restrictions. This map is created from data from handheld GPS receivers, aerial photographs and other available data sources, as well as sketches made in the field. The acquired information is stored in one central database, from which it can be downloaded either as a map image rendered with selected symbols or as vector data. Data from this source can be used, among others, for network analysis, of which the most widely used and most common function is finding the optimal route between two points. For the result of such an analysis to be considered reliable, the data used must be of appropriate quality. Because the OSM database is built by volunteers, there are no plans for its systematic development and there is no top-down quality control. Therefore, the paper proposes methods and tools to verify the suitability of the data collected so far for the purposes of network analysis, as well as to correct the errors found. Among the quality aspects considered for geographical data, particular attention was paid to completeness, positional accuracy, topological consistency and temporal accuracy. The first two in practice require comparison with the actual course of roads and streets. To address topological consistency, the use of topological rules examining selected spatial relationships between geographic objects was suggested. Once the errors detected this way are flagged, they can be corrected using automatic, semi-automatic or manual methods.
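One concrete topological rule of the kind the abstract alludes to is flagging degree-1 nodes in the road graph: each is either a genuine dead end or a digitising error such as an undershoot, where a road fails to snap to the junction it should meet. A minimal sketch (the rule choice is an assumption; the paper does not list its specific rules):

```python
from collections import defaultdict

def dangling_nodes(edges):
    """Return node ids that terminate exactly one road segment.

    `edges` is a list of (node_a, node_b) road segments. A degree-1 node
    is a candidate topology error (undershoot) or a real cul-de-sac, so
    flagged nodes still need review before correction."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return sorted(n for n, d in degree.items() if d == 1)
```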
Source:
Roczniki Geomatyki; 2012, 10, 7; 15-24
1731-5522
2449-8963
Appears in:
Roczniki Geomatyki
Content provider:
Biblioteka Nauki
Article
