Video Archives of Live Cameras at HWF

When we first started showing the Hornby Island eagle nest to the world in 2006, one of my tasks was to travel to Hornby Island each week with 3 new hard disk drives to swap with the 3 that had been filled with local video archives the previous week. From day one, the long-term goal of Hancock Wildlife Foundation has been to capture and preserve video archives of our various cameras, even though to date we have not had the facilities to do anything with them. In addition, for the past couple of seasons an "instant replay" facility was available to our viewers from the video services provider (Zaplive.TV) that handled our video distribution. That service is no longer available because Zaplive.TV has left this business.

A question in our Technical Questions discussion forum today prompted me to write this article explaining our video archives and what we do and don't have of them. If you read on, I'll include a funny story about them too.

The Technology of Archives

There are two completely different technologies involved in our archives:

  1. Server computer archives at the distribution system
  2. Consumer quality archives at the camera site

The major difference is cost. The server archives are much more expensive than the facility we use at the camera site, for the simple reason that they use components that are much faster and more responsive.

The server facilities must not only record the video that is coming in, they must also allow instant replay and retrieval of random snippets of the video by many people at once. The storage facilities that allow this to be done are as much as 100 times as expensive as what we use at the camera site. This means that if we put together a computer with 10 hard drives in it totaling 3 Terabytes of storage and costing about $1500, the equivalent server system might be $150,000. In fact, while we "consumers" can purchase multi-terabyte hard drives, the server systems are still mostly in the 300 to 750 Gigabyte range, so they not only have to purchase more expensive drives, they have to purchase, house, power and interface to more drives to get the same storage. When we were purchasing 300 Gigabyte drives, the typical server drive was 75 Gigabytes.
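To make that gap concrete, here is a rough cost-per-terabyte comparison using the figures above. The numbers are order-of-magnitude illustrations from this article, not supplier quotes.

```python
# Rough cost-per-terabyte comparison using the illustrative figures above.
# These are order-of-magnitude numbers, not actual supplier quotes.
consumer_cost_dollars = 1500     # ~10 consumer drives, about 3 TB total
server_cost_dollars = 150_000    # equivalent server-class storage system
capacity_tb = 3.0

print(f"Consumer: ${consumer_cost_dollars / capacity_tb:,.0f} per TB")  # ~$500 per TB
print(f"Server:   ${server_cost_dollars / capacity_tb:,.0f} per TB")    # ~$50,000 per TB
```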

Just picking up a server drive and feeling the difference in weight between it and a consumer drive gives an indication of the value difference. In addition to simply being made of sterner stuff, the server drives spin faster (15,000 RPM vs a typical 7,200 RPM), have a faster and more capable data interface (SCSI or SAS vs IDE and SATA), and are typically mounted in their own hot-swappable drive bays with real-time monitoring of each drive's health and "hot spares" standing by, so system reliability is far higher than the less expensive consumer systems can achieve. I have server drives that have been functioning flawlessly for almost 10 years, and have had consumer drives fail in months.

There are other differences too. We store "high resolution" video at the camera site, while the servers receive and archive the much lower "broadcast" resolution. This gives them a bit of an advantage since they don't need as much storage space, but they still need a lot in server system terms.

History of the HWF Archives

Initially, the 3 hard drives I took to Hornby Island each week were 200 Gigabytes each (and then 260 Gigs) and cost just over $100 each as "raw" drives. This was the "sweet spot" for size vs price vs reliability at the time. We were capturing 640x480 video from the one camera, and the 3 drives held about a week's worth of video. The capture machine was a separate Linux server that took a tap off the camera and microphone and captured it separately from the Windows Media encoder box that fed the distribution to the world. This setup has remained fairly consistent since then: Windows systems to encode, and Linux systems to capture archives. We've moved from Windows Media to Adobe Flash, but otherwise things are pretty much the same - at least they were until we started using some of the new "IP" cameras last year, but that's another story.
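For anyone curious what that works out to, here is a quick back-of-the-envelope calculation of the average recording bitrate implied by filling three 200 Gigabyte drives per week. The bitrate is derived from those figures, not a recorded encoder setting.

```python
# Back-of-the-envelope check of the original setup: three 200 GB drives
# filled per week implies roughly this average recording bitrate.
# The result is derived from the article's figures, not a measured value.
drive_gb = 200
drives_per_week = 3
seconds_per_week = 7 * 24 * 3600

total_bits = drive_gb * drives_per_week * 1e9 * 8
avg_mbps = total_bits / seconds_per_week / 1e6
print(f"Implied average bitrate: {avg_mbps:.1f} Mbit/s")  # roughly 8 Mbit/s
```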

The move from Hornby to Sidney that year (2006) included moving the archive server, but there was a short period when we were sending out but not archiving. The re-distribution facility did not have archive capabilities.

The size of the hard drives we could purchase at the "sweet spot" keeps going up. As noted above, we started with 200 Gigabyte drives and switched to 260's. Then 300's, 320's and 500's. This past year we've been purchasing 1 Terabyte drives, and I just purchased some 2 Terabyte drives (2000 Gigabytes) for the Port Moody facility. In addition to purchasing larger drives, we moved to putting them into systems with 10 drives each - first as 3 Terabytes in a RAID (Redundant Array of Inexpensive Disks) array using 300 Gig drives, then one of 5 Terabytes using 500 Gig drives.

The reason for using larger drives and putting them into arrays in a single machine was two-fold:

  1. it meant fewer trips to remote sites to swap in new drives, and
  2. we were using more and higher resolution cameras at each site so needed the space.

The funny story I promised has been told before, but it bears telling again - at least it's funny looking back on it anyway.

I had the video archive systems from several years at my home while I did some transfers and played around at producing some videos from them. Dealing with these huge amounts of video, literally months' worth from each season, is a daunting task. All the systems - 6 of them - were on and running in my basement, and of course drawing power. It turns out that the amount of power they were drawing (plus the other 4 of my own systems) brought my home to the attention of the local municipal inspectors through a law that allowed them to look for illegal and potentially dangerous uses of power. My use was over 3 times that of a typical urban dweller, so it was flagged.

Thus I answered the door one morning to receive a notice from the local police (RCMP) that the next day I'd be inspected, and that if I didn't allow the inspectors in, they'd get a search warrant to compel me. I offered to show the officer around the house and explained about the "eagle video" that was likely the cause of the power use, but she would have none of it and simply said she'd be back in 24 hours with the rest of the inspection team.

A bit of paranoia crept into my life and I went around the house looking for things that might be "illegal" or against the electrical code - removed a couple of instances where power bars were daisy-chained, etc. - and then sat down to think about the whole thing. I wrote a note to a mail list of people similar to me (work-at-home technology people) telling them what was happening, and sent it out. I'd had professionals install extra circuits in my office and spare room for the computers, so I was not really concerned about that aspect of things, but you just never know what might happen when you let a bunch of police and municipal inspectors into your home...

I then phoned a local newspaper reporter I know and asked her what she knew about this type of thing. The inspectors were apparently looking for marijuana (BC Bud) grow ops and my reporter friend had been asking to go along on one of these inspections for over 6 months. She asked if she could come and visit me during the inspection. I said yes.

I also received a call from a local TV station about the inspection and asking if they could come. This request came about because one of the people on my mail list was talking to their reporter when the email came across his screen - and told her about it. "Do you think he'd want cameras there?" was her question, and my friend David said "yes".

The inspection came and went - the bureaucrats blustered their way through having the reporters there, which was obviously not something they expected - and of course my home passed with flying colors, but at the time it was just a bit unnerving.

The point here is that the video archives are not just a couple of hard drives. They currently constitute several huge computers and a number of separate disk drives that are not installed in systems. Most are currently stored, not powered up, in the "Eagle War Room" at David's facilities at Hancock Ranch. Some of the more recent drives are here at my home and the current crop are at the actual sites.

One major change we've made - well, a change to and from really - is that we're back to storing video on individual drives instead of RAID arrays. The drives are now much larger than they were when we first started building the arrays, and we've had at least one major disaster with an array where we lost the whole set due to two of the 10 drives failing at the same time. The problem is, hard drives do fail, and in purchasing some of the most recent offerings of the largest drives, we've run across models with higher than expected failure rates several times. With RAID 5, where in our case 10 drives are used, we get 9 drives' worth of storage and any one of the drives can fail at any given time without us losing anything. Problems only arise when more than one drive fails before a new drive can be put in place of the failed one and the array "rebuilt" by the software system. We didn't put in a new drive fast enough in that one case.
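For readers who want to see the mechanics, here is a minimal sketch of the RAID 5 idea in Python. The XOR parity scheme is the standard one, but the block contents and the 10-drive layout below are purely illustrative, not our actual array configuration.

```python
# Minimal sketch of RAID 5 parity: with N drives, each stripe holds N-1
# data blocks plus one XOR parity block, so any single missing drive can
# be reconstructed. The data here is made up for illustration.
from functools import reduce

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# One stripe across a hypothetical 10-drive array: 9 data blocks + 1 parity.
data_blocks = [bytes([i] * 8) for i in range(9)]
parity = xor_blocks(data_blocks)

# Simulate losing one drive: rebuild its block from the 8 survivors + parity.
lost_index = 3
survivors = [b for i, b in enumerate(data_blocks) if i != lost_index]
rebuilt = xor_blocks(survivors + [parity])
assert rebuilt == data_blocks[lost_index]  # a single failure is recoverable

# Lose a second drive before the rebuild finishes and there is no longer
# enough information to recover either block - which is how we lost one array.
```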

Data "Bit Rot"

Data can disappear for any number of reasons, not just hardware failure.

As part of a clean up of my office at home, I just finished bulk-erasing and throwing out several hundred 5 1/4" floppy diskettes as well as a lot of 3 1/2" ones. Along with those, I also destroyed and threw out quite a few CDs that had data on them. All the relevant data had been copied to hard disk drives and the originals were no longer needed. I also threw out a bunch of digital tapes, some from the 1980's. Most of these I didn't get a chance to copy the data from because the tape drives that made them no longer work, and are no longer made. I could have purchased parts for the drives and tried to get them to run, but in some cases the computers that I had interface cards for no longer existed, and today's systems don't take those cards. This is "bit rot" - the inability of today's systems to read and use yesterday's data formats, either because the hardware is no longer made, or because the device itself is broken and not fixable. It's hard to purchase a system today with a 3 1/2" drive in it - and impossible to get one with a 5 1/4" - and don't get me started on the old 8" ones that one of my friends still has in his warehouse.

Another reason for bit rot is changing data formats. One set of tapes I threw out was recorded on drives that had built-in hardware compression. I could still purchase drives that could use the tapes, but the manufacturer no longer made a model with that particular compression facility, and no other drive could read the old tapes. This is why I no longer use any "proprietary" compression or storage facilities for anything. If the method is not a published standard with multiple vendors, then I don't use it.

The same thing happens with hard drives. We've gone through many different hard drive interfaces and form factors (Ken still has an 8", 8 Megabyte drive, for example), and moved from the initial 5-8 Megabyte drives I sold back in the early 1980's to the announcement this week of a 3 Terabyte drive by Western Digital. I have a 5 1/4" "full height" 800 Megabyte drive I use as a door stop.

Most recently, computer manufacturers are limiting the number of IDE drive interfaces in their new systems. In fact the most recent system I put together didn't have any, and we have many of these IDE interface drives with video on them. The newer SATA drive interface is also undergoing its own changes - from 1.5 Gbit/s to 3.0 Gbit/s and maybe faster. Who knows what the next generation of hardware will bring, and knowing the computer industry it won't be long before we'll be unable to use today's drives in tomorrow's machines.

The new 3 Terabyte drives are larger than many of today's computers can handle without special drivers, because the traditional MBR partition scheme uses 32-bit sector addresses, which with 512-byte sectors tops out at about 2.2 Terabytes. The same thing happened when drives moved beyond 30 Megabytes many years ago, and again when we got past 100 Megabytes.

It's a never-ending challenge to keep the data we have on hardware that we can use and grow with. The good thing is that today's huge drives will hold as much as several of yesterday's drives - and cost less.

The bad thing is that today's cameras are higher resolution and require more storage for the same length of time, so we're still having to purchase many drives for a season for each camera.

We've lost some archives due to hardware failure. We've failed to do archives of some cameras due to costs, and we've lost access to some archives due to failure of those who created them. We've also lost the ability of our viewers to see recent archives online, which is the subject of the question that got me into writing this article. We need to move to the next level in our creation and use of archives - our own archive facilities.


The Next Step - HWF's Production Facility - $$

David and I have recognized the need and had the design for a production facility (members only - membership is free) in mind for most of the past 5 years, ever since we first realized just how valuable the archives are.

We are creating the archive video for a reason. It is valuable in its own right as historic footage, but it is also valuable as media to be turned into education materials and research documentation. Our proposed Production Facility is key to using the archives and to codifying and rationalizing their production and distribution.

For some of our cameras we've left the archival activities up to the distributor of our signal - Zaplive.TV - but we have lost access to that facility. We need to replace it with a facility we control so this problem never happens again. Again, our proposed Production Facility would be how we would accomplish this.

And the fact is, if we are to keep the archive video we have, we will have to migrate it from the older technologies and aging drives it is on to newer technology and drives. We will also have to back it up - make duplicates of it in case of failure. This too is one of the functions of our proposed Production Facility.
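As one illustration of what that migration and backup work involves, the sketch below hashes every file on an old drive and verifies the copy on a new one before the old drive is retired. The paths and function names are hypothetical placeholders, not part of our actual tooling.

```python
# Minimal sketch of a migration verification pass: hash every file on the
# source drive and confirm the copy matches before the old drive is retired.
# Mount points below are placeholders, not our real archive layout.
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so huge video files don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(source_root, copy_root):
    """Compare every file under source_root against its counterpart under copy_root."""
    mismatches = []
    for src in Path(source_root).rglob("*"):
        if src.is_file():
            dst = Path(copy_root) / src.relative_to(source_root)
            if not dst.exists() or sha256_of(src) != sha256_of(dst):
                mismatches.append(src)
    return mismatches

# Example usage (hypothetical mount points):
# bad = verify_copy("/mnt/old_300gb_drive", "/mnt/new_2tb_drive/hornby_2006")
# print(f"{len(bad)} files failed verification")
```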

We've done what we can within the budget we have. The drives the video is stored on are relatively inexpensive, and we're no longer purchasing new systems for each 10 drives as we did for a while. We're simply grabbing the video and putting the drives on a shelf until we can put them to use.

The production facility, besides providing a central place for archives, would also have the ability to create new content channels in real time from mixes of both live cameras and archival video, as well as local and remote commentators. In effect, we're talking about a TV production facility for live wildlife video. Educators would get feeds without the ads, which is what they've asked us for all along, and viewers as well as researchers would get access to the archives of all our cameras, past and current.

We're looking for sponsors to fund our production center. In the meantime, keep your donations coming in, because we continue to need new hard drives for the archives we can create.


Comments

Authored by: HikerBikerGram on Sunday, October 24 2010 @ 08:17 AM EDT

Thanks for the information. I see it is very involved and will take time and money for seek points and hot spotting to be in place again. Thank you for sharing your story, that sounded very frightening. You handled the situation very well, it is always good to have a few extra friends around when you are being invaded.