Channeling data with a bird's-eye view of our organizations' strategies and processes, as well as those of the wider community, will lead us on a journey of new discoveries rather than down the beaten path of ownership resolution.

How does your firm handle data? Is this the most effective way, or are there better methods?

Before addressing these questions, it is prudent to take a step back and develop an understanding of the transitions we as humans make as we leap back and forth between print and digital domains. For thousands of years, we have relied on tangible sources of communication. From the papyrus reeds of early Egypt and Gutenberg's printing press to today's laser printers, data verification and integrity have been psychologically dependent on paper.

If we take a deep look at computer systems, we can see that they try to duplicate paper-handling methods or otherwise "humanize" their complex environments. Metaphors such as "desktop," "folders," "trash can," and the greener "recycle bin" are woven into today's operating systems. The references are so subtle that we instinctively follow the behaviors we have all grown accustomed to in the real world.

A simple example can be found in the use of folders. When you create a folder and place documents in it, you can easily relate this experience to its physical counterpart. When this method changes to storage within a database, for example, so does the average user’s perception and trust of the data.

As digital data transforms from file folders to database storage and beyond, the integrity and psychological acceptance of the data produced from these sources will force us to change our behavior patterns.

Don't get me wrong: the advent of computer technology revolutionized our planet! However, in the rush to get buy-in, applications began to mimic user habits to the point where replicating human behavior took precedence over putting digital data in its proper perspective. Hence, when the time comes to enhance your company's technology, these ingrained behavior patterns will present an obstacle.

Addressing issues regarding the most efficient use, format, and final destination of data is a somewhat uncommon discussion in a multidisciplinary environment. Transitioning data from one source to another is almost always left up to the core handlers of each profession. This is particularly true of projects that involve CAD and its migration to the world of GIS. I recently attended the SERUG conference in Jacksonville, Florida, where presenter Daniel Johns, a GIS Analyst II with the Clay County Utility Authority in Florida, discussed the excruciating task of auditing and painstakingly correcting CAD files by way of a host of scripts prior to uploading them into a geographic system. I know he is not alone.
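To make the problem concrete, here is a minimal sketch (in Python) of the kind of pre-migration audit described above. The layer-name aliases and the feature structure are hypothetical illustrations, not Johns's actual scripts; real cleanup rules are specific to each organization's CAD drafting standards.

```python
import re

# Hypothetical layer-name aliases; a real utility would maintain its own mapping.
LAYER_ALIASES = {
    "WTR_MAIN": "WATER_MAIN",
    "W-MAIN": "WATER_MAIN",
    "SWR": "SEWER_MAIN",
}

def audit_layer(name: str) -> str:
    """Normalize a CAD layer name before GIS import."""
    cleaned = re.sub(r"\s+", "_", name.strip().upper())
    return LAYER_ALIASES.get(cleaned, cleaned)

def audit_features(features):
    """Separate importable features from broken ones, normalizing layers."""
    clean, rejected = [], []
    for feat in features:
        if not feat.get("coords"):
            rejected.append(feat)  # no geometry: cannot be mapped, flag for review
            continue
        feat = dict(feat, layer=audit_layer(feat.get("layer", "UNKNOWN")))
        clean.append(feat)
    return clean, rejected

sample = [
    {"layer": "w-main", "coords": [(0.0, 0.0), (10.0, 0.0)]},
    {"layer": "SWR", "coords": []},  # broken entity from the CAD export
]
clean, rejected = audit_features(sample)
print(clean[0]["layer"])  # WATER_MAIN
print(len(rejected))      # 1
```

Every drafting shop that skips a shared standard ends up writing (and rewriting) scripts like this downstream, which is exactly the capture-it-once argument in reverse.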

This blog was created to discuss issues arising from data that are created and disseminated within our organizations. It serves to create awareness among surveying, mapping, and GIS professionals, and a resolve to take action.

Efficiently capturing and routing data throughout an organization, county, and/or city is the goal. For this to occur, data should be captured once, closest to its source, appropriately and accurately. Additionally, to relieve the anxiety that the data was not created by you or your department (and, hence, its dependability stigma), levels of accountability should be woven throughout the inherent processes. Channeling data in this way, with a bird's-eye view of the organization's strategies and processes as well as those of the wider community, will lead us on a journey of new discoveries rather than down the beaten path of ownership resolution.

Creating this paradigm shift in our current thought processes requires a wide cross section of disciplines such as business, technology and the engineering community. Such a combination would bring to the table a holistic approach to providing a vision for the data issues we all face.

Surveying, mapping, and GIS firms benefit by being able to derive and direct the most effective and efficient use of data standards at every level. Ultimately, it is the way we use information that will lead us into the next millennium, not the multitudes of software created to manipulate it.

What do you think? Please post your comments below.