A Sony HVR-Z1U camera. This device is a digital video workhorse at the SCC, and relies heavily on digital video tape... something which could be rather hard to come by in the near future.
My heart, thoughts, and a donation go to those affected by the earthquake, tsunami, and now the radiological crisis that Japan must grapple with. It’s no exaggeration to say this turn of events is truly unprecedented. Sitting thousands of miles away, observing events only through websites and television screens, I know I cannot possibly grasp the ordeal that survivors now face.
With that preface, it feels almost trivial to consider how the disaster will inconvenience those of us far removed. Yet the impact will be significant and lasting, given our technological dependence in a digital world, the number of electronic components and supplies produced in Japan, and the role those components play in capturing our current history and cultural heritage.
Our first hint of trouble came with an advisory issued to consumers of magnetic tape media. Sony, a major manufacturer of tape media as well as semiconductors, optical discs such as DVD and Blu-ray, and electronic components, has been hit hard, and was forced to shut down a number of factories in the region while recovery efforts continue. The earthquake has halted production at various manufacturing facilities in Japan, including those of magnetic media manufacturers, and suppliers are now warning of an impending shortage and possible price spikes:
“Our industry has already been affected by a halt in media manufacturing operations – professional media supply shortages are evident, namely HDCam SR,” explained a post on the Comtel Pro Media web site. “Worldwide stock shortages present a realistic threat to our industry and the immediate needs of the television and motion picture production.”
Of particular note is a shutdown of the Sony Corporation Sendai Technology Center, currently the only facility in the world producing HDCAM-SR tapes.
As of January 18 of this year, the National Science Foundation has enacted policies that ensure researchers take seriously the need for data sharing and dissemination. According to the new mandate:
Investigators are expected to share with other researchers, at no more than incremental cost and within a reasonable time, the primary data, samples, physical collections and other supporting materials created or gathered in the course of work under NSF grants. Grantees are expected to encourage and facilitate such sharing. See Award & Administration Guide (AAG) Chapter VI.D.4.
To that end, researchers are now required to submit a Data Management Plan with their grant requests, detailing how the project will comply with research sharing guidelines set forth by the NSF.
These requirements leave researchers with a choice: either come up with a plan on their own, or seek help from their institutions on a comprehensive data sharing and preservation model. Fortunately, the resources and tools exist at Rutgers for its researchers to easily take the latter route.
In anticipation of these data sharing requirements, the university has set up a site to guide researchers through the ins and outs of data sharing. The Rutgers University Research Data Archive site clearly explains the importance of sharing and preserving research data, and details some of the current offerings for researchers who need a platform to share their research data to comply with NSF guidelines.
It goes without saying that one such option listed on the site (and the platform I recommend) is the Rutgers University Community Repository. In anticipation of this need, the RUcore team has developed the RUResearch Data Portal, a section of our digital repository meant specifically for serving research data needs.
Already trusted by faculty members to store their academic publications, and the mandatory platform for theses and dissertations in the Graduate School of New Brunswick, RUResearch is a natural extension of RUcore’s mission: to preserve and make accessible the university’s academic output from a centralized resource that adheres to established digital preservation standards. With RUResearch, you can not only be assured of meeting NSF’s requirements on paper, but you will also have the security of knowing your research data is truly safe and preserved.
More information on data preservation services can be found on the Rutgers Libraries Website, including dates for in-person presentations on the services we offer the academic research community. And, if you are a researcher interested in how RUcore and the RUResearch platform can help you, contact our Data Services Librarian, Ryan Womack, and he will be able to give you the information you need to get started.
GMail kept users notified through a status page of their ongoing recovery efforts.
This past week delivered a dose of panic to an estimated tens of thousands of users of Google’s free Gmail service, who logged in to discover that all of their e-mail was missing. According to Google:
We released a storage software update that introduced the unexpected bug, which caused 0.02% of Gmail users to temporarily lose access to their email. When we discovered the problem, we immediately stopped the deployment of the new software and reverted to the old version.
The drumbeat continues to sound for the preservation of obsolete and endangered moving image formats, particularly videotape. As older tape formats become unplayable, either through decay or lack of equipment to play them back, the urgency grows to find ways to preserve their content using modern digital formats.
The problem has been considered by multiple organizations acting separately over the past decade, and all of them have wrangled over the same question: What digital format should be used to preserve this content in the digital space, and help ensure that we aren’t finding ourselves in the same obsolescence predicament right away? Interestingly, those analyzing this problem and making decisions for their respective organizations have often come up with different answers.
Those differing opinions, and the rationale behind them, were the subject of a talk held earlier this month in Philadelphia, at the Association of Moving Image Archivists / International Association of Sound and Audiovisual Archives (AMIA/IASA) 2010 Conference. Representatives from the respective digital preservation projects underway at the Library of Congress, Rutgers University Libraries, and Stanford University were each on hand to offer their perspectives and the paths their organizations took for digitally preserving their video.
The abstract of the talk, as well as each presenter’s slides and notes, can be found here:
I feel an important conclusion to take away from this talk is that there isn’t always a single right answer to the digital preservation conundrum. There is a common desire among preservationists to have and use a widely accepted standard format for keeping our digital objects safe in the long term. However, while formats and standards can be recommended and can work very well for a wide variety of use cases, there are always those local requirements and special needs that need to be considered, and adjustments made accordingly.
Fortunately, a great deal of progress has been made in the last several years, as those who were once wading into this problem alone have experimented and learned from past mistakes. It’s venues like this which permit that knowledge and experience to be shared, so that those preservationists just starting to consider the problem can use that wisdom, and have multiple case studies to consider in making decisions of their own.
Some of our best digital preservation projects have been the direct result of collaboration: working with dozens of separate entities that all have valuable materials they want to share with the online world. That collaboration brings some challenges, though, and one of the biggest problems we’ve run into has been how people name files after they’ve created or digitized them.
For experienced computer users who store lots of valuable information digitally, it goes without saying that clearly naming files is extremely important. Often, the filename is the first thing a user sees that identifies what a file contains. Without any other cataloging system in place, file names become the way to figure out what’s inside the hundreds of thousands of individual files that can sit on the average person’s desktop computer, and having countless “untitled” or ambiguously named files can make finding the information you want nearly impossible.
Fortunately, modern computer operating systems give people wide latitude in how they can name files. Most people have a file naming method that works best for them, and individual systems can work well, so long as they stay consistent and are easy to comprehend. However, things get trickier when such files are destined for a digital library, online repository, or other internet-based storage and delivery medium. When these architectures come into play, the latitude that modern computers give us in naming files can cause complications: web-based content management systems aren’t always as flexible or forgiving, and can sometimes reject or even mangle files with more liberally named filenames.
For this reason, it’s helpful to establish and follow a few simple ground rules when working on a digital preservation project that requires file handling.
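To make the idea concrete, here is a minimal sketch of what such ground rules might look like when automated. The specific rules (ASCII-only, lowercase, underscores instead of spaces and punctuation) are assumptions for illustration, not an actual repository policy; any real project should follow the conventions its repository mandates.

```python
import re
import unicodedata

def sanitize_filename(name: str) -> str:
    """Normalize a filename for web-friendly repository ingest.

    Hypothetical ground rules: ASCII-only, lowercase, no spaces
    or special characters, underscores as separators.
    """
    # Split off the extension so it is preserved intact
    base, dot, ext = name.rpartition(".")
    if not dot:
        base, ext = name, ""
    # Fold accented characters to their closest ASCII equivalents
    base = unicodedata.normalize("NFKD", base)
    base = base.encode("ascii", "ignore").decode("ascii")
    # Collapse spaces and any run of unsafe characters into one underscore
    base = re.sub(r"[^A-Za-z0-9_-]+", "_", base).strip("_")
    return base.lower() + (("." + ext.lower()) if ext else "")

print(sanitize_filename("Café Interview (Take 2).MOV"))
# cafe_interview_take_2.mov
```

A batch of digitized files could be run through a function like this before upload, guaranteeing that nothing in the set will trip up a stricter content management system.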