September 14th, 2007, 11:13 AM | #1 |
New Boot
Join Date: Mar 2005
Location: London, Ontario Canada
Posts: 22
|
Fast network storage?
Thanks in advance for any thoughts on this subject!!!
I've been really struggling with this. I'm looking for a system that has plenty of storage, say 1-2 TB or more, shared between 2 computers, but be fast enough to to edit DV as if I'm editing on footage stored on my computer. As of right now I have 2 hard rives in my machine, one for OS/software (premiere pro) one for the footage I work on. I have a third firewire hard drive for storage. We also have a network attached 1TB server for storage after were done editing. There is one other editing machine similar setup. This works fairly well so far, but as we get more and more work I find we have to share more video together and we use more and more old footage that we store on our server, which I find to be just too slow and too much of a hassle So what I'm lookin for is basically a faster version of of our 1TB server. Ive looked around and there seems to be products that fit my needs but are for avid systems or final cut pro, or r just way too expensive. What are the options for the connection of network server? It seems standard ethernet is too slow, is gigabit ethernet much faster for this? Are there firewire solutions? I stumbled upon one where it seemed to be mutlitple channels of sata cables requiring a special pci express card, but I cant remember what company thats was from. Id really appreciate if anyone had any info on this type of product, what they use themselves, any DIY ideas or even new products they hear are coming out in the future. |
September 14th, 2007, 02:21 PM | #2 | |
Major Player
Join Date: Jan 2007
Location: Portland, OR
Posts: 949
|
Gigabit ethernet has a theoretical maximum throughput of 125 MB/s, but after accounting for overhead and the difference between theory and practice, you can't count on getting more than 80 MB/s. Sadly, even this paltry rate isn't attained by typical fileservers. Most I've seen struggle to get even just 30 MB/s, perhaps due to the protocol (SMB, usually) or implementation of the protocol (Samba). The net result is that NAS should not be used for performance-critical applications. Instead, narrow your search to locally-attached storage solutions. |
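As a back-of-the-envelope check on those numbers, here's a minimal Python sketch. The 3.6 MB/s DV25 data rate and the throughput figures above are approximations, not measurements:

```python
# Rough throughput check using the figures from this post:
# ~125 MB/s theoretical gigabit ethernet, ~80 MB/s practical,
# ~30 MB/s for a typical SMB fileserver. A DV25 stream with audio
# is roughly 3.6 MB/s on disk (approximate, not exact).

DV_STREAM_MB_S = 3.6  # approximate DV25 data rate in MB/s

def max_streams(throughput_mb_s: float, stream_mb_s: float = DV_STREAM_MB_S) -> int:
    """How many simultaneous DV streams a link can feed, ignoring seek latency."""
    return int(throughput_mb_s // stream_mb_s)

if __name__ == "__main__":
    for label, rate in [("GigE theoretical", 125.0),
                        ("GigE practical", 80.0),
                        ("Typical SMB NAS", 30.0)]:
        print(f"{label}: {rate} MB/s -> ~{max_streams(rate)} DV streams")
```

Even the pessimistic 30 MB/s NAS figure covers several DV streams on paper; the real pain is usually latency and contention, not raw bandwidth.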
|
September 14th, 2007, 03:10 PM | #3 |
Inner Circle
Join Date: Feb 2007
Location: Tucson AZ
Posts: 2,211
|
Shared high-end storage devices on Fibre Channel are pretty quick. But be prepared to spend big bucks. Very, very big bucks.
For sharing, the most cost-effective approach would be to store material on a shared server on a gigabit Ethernet network and stage it to your workstation before using it - i.e., "check it out" from the central repository the night before you want to edit, or a couple of hours ahead of time, copy it to your workstation, and send it back to the server when you're done with it. |
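That check-out/check-in workflow can be scripted in a few lines. A minimal Python sketch, where the server path, scratch path, and project name are all hypothetical placeholders:

```python
# Minimal sketch of the "check out / check in" staging workflow described
# above. All paths and the project name are hypothetical placeholders.
import shutil
from pathlib import Path

def check_out(server: Path, scratch: Path, project: str) -> Path:
    """Copy a project's footage from the shared server to a fast local drive."""
    dst = scratch / project
    shutil.copytree(server / project, dst, dirs_exist_ok=True)
    return dst

def check_in(server: Path, scratch: Path, project: str) -> None:
    """Copy the edited project back to the server and free the local copy."""
    shutil.copytree(scratch / project, server / project, dirs_exist_ok=True)
    shutil.rmtree(scratch / project)

if __name__ == "__main__":
    server = Path("//server/footage")   # assumed UNC path to the 1 TB server
    scratch = Path("D:/scratch")        # assumed fast local editing drive
    check_out(server, scratch, "client_promo")  # the night before the edit
    # ... edit in Premiere Pro against the fast local copy ...
    check_in(server, scratch, "client_promo")   # when the edit is done
```

With only two editors, a naming convention (or a simple lock file) is probably enough to avoid two people checking out the same project at once.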
September 17th, 2007, 07:24 AM | #4 |
New Boot
Join Date: Mar 2005
Location: London, Ontario Canada
Posts: 22
|
Thanks for the help, guys. It turns out that our server is gigabit Ethernet, so I guess I'm just really impatient. I'll have a look at Fibre Channel to see how insane the prices are. I'm still trying to find that one company I came across earlier that was offering some sort of SATA PCI card that used four cables connected to the hard drives. I'll post it if I can find it.
|
September 17th, 2007, 11:59 AM | #5 | |
Major Player
Join Date: Jan 2007
Location: Portland, OR
Posts: 949
|
I second Jim's idea about checking out shared files.
Furthermore, Fibre Channel is not very cost-effective for throughput-limited disk I/O (i.e. video). You can attain 6.4 Gbps sustained reads on a cheap 8-port SATA array for around $3,000. Similar performance from Fibre Channel would require an 8GFC system, which is very rare and costs well over $10,000. Fibre Channel is better suited to latency-limited disk I/O, situations that genuinely require storage networking, or $100,000+ budgets.

There are a lot of vendors with four-port eSATA RAID cards: Areca, Norco, Sonnet, and I'm sure many others. There are an even greater number of four-disk enclosures with eSATA ports. However, these are all local storage systems, not networked. |
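A quick sanity check on that cost comparison, sketched in Python. Both prices are 2007 ballpark figures from the post, not vendor quotes:

```python
# Rough cost-per-throughput comparison using the figures quoted above.
# Both prices are 2007 ballpark numbers, not vendor quotes.

def cost_per_mb_s(price_usd: float, gbps: float) -> float:
    """Dollars per MB/s of sustained throughput (decimal Gbit/s -> MB/s)."""
    mb_s = gbps * 1000 / 8
    return price_usd / mb_s

if __name__ == "__main__":
    sata = cost_per_mb_s(3_000, 6.4)    # 8-port SATA array
    fc = cost_per_mb_s(10_000, 6.4)     # 8GFC system at similar throughput
    print(f"SATA: ${sata:.2f}/MB/s vs Fibre Channel: ${fc:.2f}/MB/s")
```

At 6.4 Gbps (800 MB/s) sustained, that works out to roughly $3.75 per MB/s for SATA versus $12.50 or more per MB/s for Fibre Channel.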
|
September 17th, 2007, 12:43 PM | #6 |
Inner Circle
Join Date: Feb 2007
Location: Tucson AZ
Posts: 2,211
|
The idea of checking stuff out of a shared storage device is not new, but it does work OK, and it is probably the cheapest way to go, depending on the degree of sharing you need. I haven't really tried to configure one, but if the number of users is low enough, simple manual controls are probably adequate.
Shared storage over a Fibre Channel SAN is quite common in high-end systems, but it is absolutely not cheap, nor something for the smaller outfit. You do, of course, as you point out, require a system that can handle resource serialization and locking. Maybe $100k would be on the low side :<) One of the network IT execs presented his content-archiving system at Storage Networking World last year. I can't recall the details, but if I find a reference to the presentation I'll post it. |
September 17th, 2007, 01:12 PM | #7 |
New Boot
Join Date: Mar 2005
Location: London, Ontario Canada
Posts: 22
|
"Areca, Norco, Sonnet"
Can any of these be shared between 2 PCs? By the way, thanks for those names - that is exactly the stuff I was thinking of. |
September 17th, 2007, 01:33 PM | #8 | |
Major Player
Join Date: Jan 2007
Location: Portland, OR
Posts: 949
|
Quick recap for you:

NAS: easy, cheap, slow, shared.
Local storage: easy, cheap, fast, not shared.
Shared storage: hard, expensive, fast, shared.

In any case, here are some example storage arrays that could be used with a shared filesystem:

PROMISE VTRAKJ300SROHS
Promise VTrak 12110
HP 302970-B21
HP 335880-B21 |
|
September 17th, 2007, 02:10 PM | #9 |
New Boot
Join Date: Mar 2005
Location: London, Ontario Canada
Posts: 22
|
Well I guess I "may" be a little inexperienced, but the PROMISE VTRAKJ300SROHS option you posted is pretty much what I was asking about. I'll just have to be more articulate next time.
|
September 17th, 2007, 02:11 PM | #10 |
Tourist
Join Date: Mar 2007
Location: London , Ontario
Posts: 2
|
Why spend the money to hire somebody when we can do the research and implementation ourselves and gain all the knowledge of the process? The reason our company is successful is that we are a group of people who want to problem-solve rather than just "hire someone" because we're scared of being "underqualified."
|
September 17th, 2007, 03:14 PM | #11 | ||
Major Player
Join Date: Jan 2007
Location: Portland, OR
Posts: 949
|
Similarly, I'm not sure that setting up shared filesystems is core to your business. Are you going to monetize that experience by creating a new profit center - a side business installing distributed filesystems? If not, then it's just a question of expense: what is cheaper in the long run? How many hours will you spend learning how to do it yourself, and how much is your time worth? What if a professional can set it up for $800, but it takes you 80 hours to learn? How many times will you reuse that experience to cut costs? And factor in the ongoing learning that rapidly changing systems demand.

Another complicating factor is the sheer cost of the equipment. You can't learn how to set up a shared filesystem unless you actually have the hardware in your hands, so you have to buy first and learn later. What if, due to your inexperience, you buy $4,000 more equipment than necessary? Or something that won't be compatible with your final solution? What if you don't need shared storage at all, and multiple local storage systems with change-management software would be cheaper and faster? That's partly why I recommended a training session where they give you access to actual equipment. (I first learned shared filesystems at a two-day session six years ago.)

Despite all the cautionary comments, you have the most important thing necessary for do-it-yourself, and that's drive (or is it frugality?). I say go for it. |
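The break-even arithmetic above can be made concrete. A tiny Python sketch, where the $25/hour figure is an illustrative assumption, not anyone's actual rate:

```python
# Hypothetical hire-vs-DIY break-even sketch for the question above.
# The $800 fee and 80 hours come from the post; the hourly value of
# your own time is an illustrative assumption.

def diy_extra_cost(pro_fee: float, learn_hours: float, hourly_value: float) -> float:
    """Extra first-time cost of DIY (positive means DIY costs more)."""
    return learn_hours * hourly_value - pro_fee

if __name__ == "__main__":
    extra = diy_extra_cost(pro_fee=800, learn_hours=80, hourly_value=25)
    print(f"First-time DIY premium: ${extra:.0f}")
```

At $25/hour the first DIY setup costs $1,200 more than hiring out; the learning cost amortizes only if you repeat the exercise.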