The main goal of this research is the elicitation of guidelines for mapping one specific aspect of data quality, namely attribute accuracy. This is the first time (to the knowledge of the authors) that design guidelines for mapping attribute accuracy have been based on an empirical investigation. Data for the main experiment were collected on-the-fly and included response times and the correctness and confidence of both siting decisions. In a repeated trials design, the test maps on which the siting decisions were based varied the value, texture, or saturation used to display attribute certainty. The paper focuses on the interpretation of the statistical results and the formulation of rules based on these results. One result indicates that subjects associate value with attribute certainty more quickly than either texture or saturation. Other results indicate that differing graphical treatments are associated with differences in the frequency of correct responses. The most intriguing result indicates that test subjects respond to maps embedded with attribute accuracy more quickly than to maps embedded with other forms of attribute detail. The implication, that data quality information is not comprehended as an increase in visual complexity, is discussed in detail.
Information spaces are characterized by the information each contains and by a certain organizing structure. The structural characteristics of information spaces can differ dramatically. Hypermedia researchers have been struggling with the "lost in hyperspace" phenomenon for years. One proposed approach creates "maps" showing the structure of hypermedia in two-dimensional graphical form. This is an example of a process called spatialization, i.e. the application of spatial metaphors to organize or understand collections of geographic or non-geographic information. The justification is that the spatial metaphor is readily comprehended by viewers, particularly when it is accompanied by visual displays. The spatialization concept is becoming increasingly popular, yet it remains ill-defined. In this paper we present a rigorous definition of spatialization and a case study completed with real non-geographic data. We present a number of visualization techniques demonstrating that cartographic logic can improve spatialization methods. Our research is based on the assumption that cartographic principles of scale and generalization can have a major clarifying impact on this emerging method of data organization.
A digital library should be more than a physical library in electronic form. In a digital library, traditional distinctions between books, digital spatial coordinates, maps and satellite imagery should become transparent to library patrons. It should be possible to retrieve maps and images and overlay them with digital attributes from another data source. The digital library catalog should include digital files that are archived in depositories distributed across the nation. It should be possible to browse spatial metadata prior to downloading files. Patrons should be able to visit the library without ever leaving their own offices. Our presentation will provide an overview of the Alexandria Digital Library project (ADL), which provides comprehensive services of a map and imagery library over the Internet. We will demonstrate the publicly accessible Web implementation. We will describe the origins of ADL and its objective of merging maps and images into the library information mainstream. We will describe the development of the ADL prototypes, and focus on the features of the current implementation that distinguish ADL from other efforts. We will present research issues raised by ADL and their likely impact on the accessibility of spatial data to earth system scientists.
This paper provides an update on the Alexandria Digital Library project (ADL), centered at the University of California, Santa Barbara, with a satellite site at the University of Colorado, Boulder. We discuss the components of the publicly accessible Web implementation. We describe the origins of ADL, its objectives of providing access to the services of a map and imagery library over the Internet, and of merging maps and images into the library information mainstream. We describe the development of the ADL prototypes, and focus on the features of the current implementation that distinguish ADL from other digital library efforts. The paper ends with an overview of outstanding research issues raised by ADL and other related projects, and of the impact such developments are likely to have on the accessibility of spatial data.
This research investigates the impact on spatial decision support of embedding map displays with attribute accuracy. Subject testing has established guidelines that can be incorporated as GIS graphical defaults. The experiments tested cartographic symbol characteristics (value, texture and saturation), the level of map detail, and the difficulty of the decision-making task. The experiment simulated the siting of a park and the siting of an airport. Test subjects were presented with specific criteria and asked to decide which sites met all criteria, some criteria, or none. Data were collected to record the response time, correctness and confidence of both siting decisions, using a repeated trials design that varied the value, texture or saturation used to display attribute accuracy. Results indicate that attribute accuracy can be embedded in GIS displays without confusing map readers, provided specific symbol schemes are followed. However, the results vary with the difficulty of the decision-making task. This paper presents the testing design, along with statistical results and their implications for setting GIS graphical defaults. The research uncovers new information about assimilating data quality information into graphical displays. On a practical level, it establishes symbolization schemes for mapping attribute accuracy. These guidelines should be incorporated as GIS graphical defaults in anticipation of digital datasets that include data quality information. This research continues the trend of re-establishing empirical testing as a valid paradigm for eliciting and formalizing cartographic design knowledge.
Browsing through very large archives can frustrate and eventually impede the retrieval of information. The paper presents an approach to searching the catalogs of very large electronic archives that combines computer visualization with the descriptive and analytical power of geography. This method is called "spatialization", referring to the application of spatial (and visual) metaphors for organizing large volumes of information that are not necessarily spatial in nature. The power of the metaphor follows from many commonly accepted geographic principles. One of these is Tobler's Law, that items located closer together are more similar than items located farther apart, which is useful for categorizing 'regions' in the catalog. Another is the concept of scale-dependence, related to the details that emerge as one observes a geographic landscape more closely. A similar hierarchy can be apparent in archive catalogs as a person refines a search. Other principles may also apply, for example intervisibility analysis, applicable to cross-referencing various portions of the catalog using a geographic metaphor of "line-of-sight". Previous attempts to apply spatial and particularly geographic metaphors to large archives have only partially succeeded. This is due in some part to the technique used to 'locate' archive items within the confines of the metaphoric 'space'. We apply multidimensional scaling to descriptive keywords, establishing a numeric coordinate system whose properties can support the principles identified above. We implement the metaphor of digital terrain representation on a catalog of roughly 100 news stories. This is not a truly large archive; however, it demonstrates a proof of concept and provides a means to identify potential limitations and directions for future research. The paper presents specific details on constructing this spatialization, discusses various problems encountered, and introduces ways in which geospatial information technology may be applied in the process.
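The core of this approach, projecting items into a metaphoric 'space' so that similar items land near one another, can be illustrated with classical multidimensional scaling. The sketch below is not the authors' implementation; it assumes keyword comparisons have already been reduced to a symmetric dissimilarity matrix, and the toy documents and values are purely illustrative.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed n items in k dimensions
    from an n-by-n symmetric dissimilarity matrix d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)             # eigenvalues, ascending
    idx = np.argsort(vals)[::-1][:k]           # keep the k largest
    scale = np.sqrt(np.clip(vals[idx], 0, None))
    return vecs[:, idx] * scale                # n-by-k coordinates

# Toy dissimilarities among four documents: 0 and 1 share keywords,
# as do 2 and 3; the two pairs are mutually dissimilar.
d = np.array([[0., 1., 5., 5.],
              [1., 0., 5., 5.],
              [5., 5., 0., 1.],
              [5., 5., 1., 0.]])
coords = classical_mds(d, k=2)
```

In the resulting coordinates, documents 0 and 1 sit closer to each other than to documents 2 and 3, which is exactly the Tobler-style regionalization the metaphor relies on; a terrain surface can then be interpolated over these coordinates.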
Today the processing and analysis of geographical information on the Internet is complicated by an increasing volume of information. We present a system for directly manipulating geographical data using object-oriented approaches and graphical user interface (GUI) design. The study concentrates on vector data and map overlay operations. A case study was conducted using this system for potential site selection in the Ellington, Connecticut area. The GUI design of the system uses icons to represent geographical data and their operations. Object-oriented approaches are adopted in establishing a knowledge-based GIS. This study suggests that the next generation of GIS user interfaces should provide an intelligent agent to assist users in searching, querying and operating on data.
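The object-oriented pairing of vector data with overlay operations described above can be sketched in a few classes. This is a minimal illustration, not the paper's system: geometry is reduced to bounding rectangles, and all class and attribute names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A vector feature reduced to its bounding rectangle (toy geometry)."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float
    attrs: dict

class Layer:
    """A named collection of vector features; overlay is a method on the object."""
    def __init__(self, name, features):
        self.name = name
        self.features = features

    def overlay(self, other):
        """Intersection overlay: keep overlapping regions, merge attributes."""
        out = []
        for a in self.features:
            for b in other.features:
                xmin, ymin = max(a.xmin, b.xmin), max(a.ymin, b.ymin)
                xmax, ymax = min(a.xmax, b.xmax), min(a.ymax, b.ymax)
                if xmin < xmax and ymin < ymax:  # rectangles overlap
                    out.append(Feature(xmin, ymin, xmax, ymax,
                                       {**a.attrs, **b.attrs}))
        return Layer(f"{self.name}_x_{other.name}", out)

# Site selection in the style of the case study: land that is both
# zoned commercial and outside a floodplain.
zoning = Layer("zoning", [Feature(0, 0, 10, 10, {"zone": "commercial"})])
dryland = Layer("dryland", [Feature(5, 5, 20, 20, {"flood": "no"})])
sites = zoning.overlay(dryland)
```

A GUI of the kind described would bind an icon to each `Layer` object and expose `overlay` as a drag-and-drop operation between icons.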
Discussions about the National Spatial Data Infrastructure and its impact upon the future are most often carried on in a dialog between data producers and current data users, including educators and researchers from university settings. The potential data users are most often considered to be individuals or agencies who plan to adopt or have recently adopted NSDI-related technologies. What about the longer-term future, that is, the NSDI users who are presently in school? The ultimate assurance of successful adoption of a new or emerging phenomenon is that it becomes 'transparent' in everyday use. Weiser (1991) gives examples of a street sign or an elevator panel, commenting that we use these navigational devices without even thinking about them, and thus they are (as he puts it) embedded into the fabric of everyday life. To integrate spatial data use in the broadest societal context, we must embed both training and awareness of spatial data and analysis into all levels of education, including K-12, college and university, and lifelong learning environments.
To serve those who need digital data, new products appear with increasing frequency, and one can access increasing quantities of geographic data on the Internet. Paradoxically, as more data become available they become more difficult to locate, to download, and to certify as valid. A major challenge in the coming decade is to enhance the accessibility, communication and use of geographically referenced data. The Alexandria Digital Library Project implements a software testbed delivering comprehensive library services to browse and retrieve maps, imagery, historical air photos, and other georeferenced digital data distributed on local and wide-area (Internet) networks. A working prototype of the Library is complete. User evaluation plays an important role in testing the effectiveness of current software functions for browsing environmental data. The current interface design embeds online user evaluation mechanisms, including object-oriented interactive logging to monitor use patterns and use-error patterns. Interactive dialog tools enable users to annotate specific system commands and behavior that delight or confuse them. Logs and user dialog are analyzed to guide interface refinement. The intention is to optimize an interface for browsing environmental data on the Internet. The interface and user evaluation tools will be demonstrated at the conference.
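The embedded evaluation mechanism, interactive logging of commands plus user annotations, can be sketched as a small class. This is an assumed illustration of the general technique, not ADL's actual instrumentation; the class, method, and command names are hypothetical.

```python
import time
from collections import Counter

class InteractionLogger:
    """Records timestamped interface events and optional user annotations
    so that use patterns and use-error patterns can be analyzed later."""
    def __init__(self):
        self.events = []

    def log(self, command, ok=True, note=None):
        """Record one command; ok=False marks a use error,
        note carries a free-text user annotation."""
        self.events.append({"t": time.time(), "command": command,
                            "ok": ok, "note": note})

    def error_counts(self):
        """Commands ranked by failure frequency: candidates for redesign."""
        return Counter(e["command"] for e in self.events if not e["ok"])

# Hypothetical session: one successful zoom, two failed gazetteer searches,
# one annotated by the user.
log = InteractionLogger()
log.log("zoom")
log.log("gazetteer_search", ok=False, note="no results; query syntax unclear")
log.log("gazetteer_search", ok=False)
worst = log.error_counts().most_common(1)[0]
```

Aggregating `error_counts` across many sessions is the kind of analysis that would guide the interface refinement the abstract describes.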
Both uncertainty and errors are inherent in spatial databases. The processes of observing, measuring, interpreting, classifying and analyzing data, among other operations, give rise to systematic and random errors. Some errors may be quite large (blunders) and easily detectable. Other errors and uncertainties in spatial data are more subtle and are not easily detected or evaluated. Casual users of GIS may not be aware of their presence or even the possibility of their existence. These are the most problematic and the ones we must try hardest to illuminate. Graphical methods, when used in conjunction with error analysis, provide a means for identifying both gross and subtle errors and evaluating the uncertainty in geographic data. The implications for spatial analysis and spatial decision making of error and uncertainty in geographic data can be identified in theoretical work (for example in spatial statistics), in domain-specific applications (for example in environmental resource management) and in empirical testing (for example in recent cartography and cognitive science research). This chapter outlines a rationale for the use of graphical methods, highlights several historic and recent examples, develops a framework for graphical methods, and points to research challenges for the future and the potential for new techniques arising from technical innovations.