jblessing

Goodbye Mac Pro, Hello Octane PC


The time has finally come to leave our Mac Pros behind. We want to expand our use of Octane in C4D, so we have been looking at 4-GPU PC options. Does anyone here have experience with these?

It looks like Boxx and a couple of others have some options, but a custom-built system seems to be the best value right now. How does a workstation based on these parts sound?

CASE/MOTHERBOARD/PSU
SUPERMICRO SYS-7047GR-TRF
(4U Rackmountable / Tower Server Barebone Dual LGA 2011 Intel C602 DDR3 1600/1333/1066/800)

Dual 10-core CPU
Intel Xeon E5-2690 v2 Ivy Bridge-EP 3.0GHz LGA 2011 130W Server Processor BX80635E52690V2

4 x GTX Titan Video Cards
EVGA 06G-P4-3793-KR GeForce GTX TITAN BLACK Superclocked 6GB

32GB RAM

SSD Cache Drive

SSD System Drive

Project Drive 1TB
(Western Digital WD VelociRaptor 10k HD)

 

Windows 7

 

(existing external RAID)

 

If I could find some 6GB 780 Ti cards, I would go with those over the Titans, but I can't seem to find them anywhere. We may also do some GPU render nodes based on a stripped-down version of this (basically one HD and a much cheaper CPU).

 

Is Windows 7 still the best choice?


Hey John -- a couple of notes on my end, and feel free to stop by, yourself or with your team, to check out what we are running here. Email or call with any questions.

 

CPU/MB -- I would skip the dual-socket motherboard and Xeons and go for a top-of-the-line i7 (even the Extreme edition) with a pro-series or workstation-rated motherboard.

-- The reason is that those CPUs run about 2K each, while for 4K you can set yourself up with a stupid-fast workstation that's overclocked to 4.5GHz. The Xeons are nice but way too costly IMO. Same thing for the MB: for 1K you can build out a nice case of your own with great parts and save a ton.

 

GPU -- If you have the cash for the Titans then spend it, but I would personally run 2 or maybe 3 high-end gaming cards. Stacking cards does not seem to double the Octane speed; each added card only buys you a bit more. I have some friends who sold their Titans to double down on 780s. You can get them with 4 gigs of RAM for (relatively) cheap.

 

RAM -- Way more than that. If you go with a standard or high-end board, max it out at 64 or 128GB, since that simply lets you use more programs at once.

 

SSD -- For the boot drive, buy something super reliable like a Samsung Pro. For the cache drive, who cares: just get a cheap SSD.

 

Project drive -- Ideally get 7200rpm drives at 2-3TB and RAID 2 or 3 of them together. Much faster than that 10K drive alone.

 

Windows 7 is my jam for now, but that's because I never want to upgrade early.

 

My basic stance, and the reasoning behind all this, is that you want a high-end responsive workstation to get work done fast, and then a beast of a renderer as well. These are not always best combined into one machine. Our studio opted for the most responsive workstations we could afford and then threw the rest of the money at a PC render farm to support that side of things. The nodes we built are just a MB with onboard video, a cheap overclocked processor (i7), and a small HD. They come in at about 700-800 with Windows and all. If you added a $400 video card for Octane, you are still only looking at about $1,200 max for something that clocks in near the old 12-core Mac Pro tower.

 

Our thought was to have a render farm that processes and a workstation that builds. Then research the CPUs you are looking at and see which adds up to the best Cinebench score.

 

Looks like one of these machines would be roughly 14K, so building 3 of them for the studio comes to about 42K. You could instead build out 3 beast workstations at 6-7K each and have roughly 21K left over for render nodes. At even 1.5K per node for the farm, that's about 14 8-core machines for the cash, and the thing would destroy your projects.
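To make that math explicit, here is the back-of-the-envelope in Python (all figures are the rough ballpark numbers above, not quotes):

    # Budget split: 3 dual-Xeon monsters vs. 3 lean workstations + a farm
    dual_xeon_build = 14_000               # one 4-GPU dual-Xeon machine
    total_budget = 3 * dual_xeon_build     # 42,000 for three of them
    workstation = 7_000                    # one overclocked i7 "beast"
    farm_budget = total_budget - 3 * workstation   # 21,000 left for nodes
    node_cost = 1_500                      # one cheap 8-core render node
    print(farm_budget // node_cost)        # -> 14 nodes for the farm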


 

 

Is Windows 7 still the best choice?

Windows 8 is solid. We have been on it for quite a while now, probably more than a year across all workstations.

 

Once you install Win 8, you will need to Google how to disable all of the RT / tablet-oriented stuff. The new Start screen will be there, and I promise you will learn to love it, but you must disable all of those picture viewers, chat applications, mail clients, etc., or they will drive you mad.

 

Rumor has it that the new CEO of MS is trying to get a more desktop-oriented version of Windows 8 out the door ASAP.

 

 

 

we have been looking at 4-GPU PC options. Does anyone here have experience with these?

 

I haven't toyed with Octane much, but from the GPU-computing experience I do have: be sure to look up memory-to-cores requirements. Say you have four 6GB cards. While you may be able to use all cores across all cards, you will only be able to utilize the VRAM of a single card. Maybe this has changed or is different for Octane, but it's still the situation in TurbulenceFD, which I have been working with over the past week.
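In other words, the usual multi-GPU memory rule looks like this (a sketch of my understanding; verify against Octane's own docs):

    # Cores add up across cards, but VRAM does not: the whole scene
    # must fit on EACH card, so the smallest card sets the ceiling.
    cards_vram_gb = [6, 6, 6, 6]          # four 6GB cards
    total_cores = 4 * 2880                # CUDA cores DO combine (2,880 per Titan Black)
    usable_vram_gb = min(cards_vram_gb)   # still only 6GB for the scene
    print(total_cores, usable_vram_gb)    # 11520 cores, 6GB ceiling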

 

 

 

If I could find some 6GB 780 Ti cards, I would go with those over the Titans, but I can't seem to find them anywhere.

 

I don't think there are 6GB Tis, but you could get a 780 with 6.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600440544%20600480021%20600358543&IsNodeId=1&name=6GB

 

When I upgraded from a 680 to a Titan, there was only a 12-14% increase. For twice the price, it's not really worth it. Get the 780s. But yes: 6GB of VRAM is critical.
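Put in performance-per-dollar terms (illustrative numbers only, using the rough figures above):

    # ~12% faster for ~2x the price is roughly half the value per dollar
    speedup, price_ratio = 1.12, 2.0
    print(speedup / price_ratio)   # -> 0.56x the performance-per-dollar of a 680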

 

 

 

 

Project Drive 1TB

(Western Digital WD VelociRaptor 10k HD)

 

I don't know what you are planning to put on your "project drive", but surely one drive won't cut it. The main problem with the board you have selected is that there are only 2 SATA III ports, which will be used by your system and cache SSDs. So if you are storing media on it, it will feel relatively slow, since it's only getting a 3Gb/s pipe. And if you are storing project files, it's not secure, because if the drive fails you lose everything.

 

You could either set this up as RAID 1 for security, or RAID 10 for security and speed. That case has room for it: for RAID 10, I'd recommend using all 8 available bays. It's okay to leave the small SSDs hanging loose but tucked away in some nook within the case.
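Rough numbers on what an 8-bay RAID 10 buys you (back-of-envelope, assuming ~150MB/s per 7200rpm disk):

    # RAID 10: drives are mirrored in pairs, then striped across the pairs.
    drives, drive_tb, drive_mb_s = 8, 3, 150   # assumed per-disk specs
    pairs = drives // 2
    capacity_tb = pairs * drive_tb             # 12TB usable (half goes to mirrors)
    write_mb_s = pairs * drive_mb_s            # ~600MB/s striped writes, ideal case
    # Reads can go higher still, since both halves of a mirror can serve them.
    # Compare the lone VelociRaptor: 1TB, roughly 200MB/s, and no redundancy.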

 

 

 

 

CASE/MOTHERBOARD/PSU

SUPERMICRO SYS-7047GR-TRF

 

You could save money and probably get better specs if you shop for a motherboard, PSU, and case separately. It's a nice case, but since it supports Xeon E5 v2, maybe you can find one that has Thunderbolt (not completely sure about that, though) and more SATA III ports.

 

Also, since you are using 4 GPUs, you will be running low on expansion slots. See if there are other options that give you more room. If you end up going i7, there are far better boards that leave room for expansion.

 

 

 

Something to consider: for HUGE jobs, you can use an Amazon GPU farm for cheap when needed, and save some coin on all those GPUs.

 

 

I'll echo Edrine's thoughts on the differentiation between a workstation and a render station. An overclocked i7 actually outperforms a dual-Xeon setup in most in-app situations. Dual Xeons are good if you do a lot of 3D rendering, but considering you seem to be going all-out on GPU, I feel it's a bit pointless.


From what I've been looking at, I think getting Titans/780s is overkill right now.

 

I'd say get some used EVGA 580s/590s. EVGA gives a lifetime warranty on them, and you get MUCH better bang for your buck.

 

I'd wait for the 8-series before investing in new GPUs.

 

At the very least, fill up your render nodes with 580s/590s. You get the same performance at like 30% of the price. Maybe even buy some extras to pop in if any fail.

 

I really want to buy a new system right now, but I just don't want to for such marginal increases in rendering power.


Thanks for the input, guys. We need to have these built and ready to go by September, so I don't have time to wait on new hardware that may be coming out. We may go with 6GB 780s in order to get another GPU render node... we'll see how the numbers work out. An extra $2k per workstation for 1,500 more CUDA cores with Titans is a bit steep.

 

I have been spec'ing an i7 option as well, but I'm concerned that it may not have enough CPU power for AE rendering. We can network-render anything in C4D (CPU or GPU), but we don't network-render AE. I also like the idea of having some stuff render on the GPU (Octane-net) while something else renders on the CPU (AE / Team Render) overnight...

 

This is the latest i7 spec, using a vendor that does liquid cooling for everything:


MAINGEAR FORCE X79 (system-EPIC-FORCE-x79)

Chassis: Corsair® Obsidian 900D with Window
Exterior Finish: Brushed Black Aluminum with Acrylic and Matte Black Accents
Motherboard: Asus® Rampage IV Black Supporting USB 3.0, SATA 6G, 802.11ac Wireless [EK Watercooled]
Processor: Intel® Core™ i7 4960X Six-core 3.6GHz/4.0GHz Turbo 15MB L3 Cache w/ HyperThreading
Processor Cooling: EK Supremacy - Nickel
Tubing: Primochill Advance LRT Flexible PVC Tubing
Heat Exchanger Array: EK 420 (3x140mm) copper core radiator with Corsair® high airflow fans
BiTurbo Pump Array: BiTurbo Dual Laing D5 Vario Pumps
Reservoir: 400mL Bitspower Oversized Reservoir
Coolant: EKoolant extra pure, distilled, and deionized water
Memory: 64GB Corsair® Dominator™ Platinum DDR3-2133 1.65V (8x8GB)
Graphics and GPGPU Accelerator: 4x NVIDIA® GeForce™ GTX TITAN Black 24GB Total GDDR5 In SLI [EK Watercooled]
Power Supply: 1600 Watt LEPA G Series G1600-MA
Hard Drive Bay One: 250GB Samsung® 840 EVO SSD (w/TRIM) [540MB/s Sequential Reads]
Hard Drive Bay Two: 1TB Western Digital VelociRaptor SATA 6G 10,000rpm 64MB Cache
Hard Drive Bay Three: 120GB Corsair® Force GT SSD SATA 6G (w/ TRIM) [555MB/s Reads]
Hard Drive Bay Four: Pre-Wired SATA Backplane Expansion Bracket For Easy Upgrades
Optical Drive One: 24X Dual Layer DVD RW Drive
Operating System: Microsoft Windows 8.1 Pro 64-bit
Angelic Service Warranty: Lifetime Angelic Service Labor and Phone Support with 2 Year Comprehensive Warranty

 

Product Subtotal: $12,171.50

 

So their full liquid cooling and warranty add $4k to our custom build with basically the exact same parts (except using a Corsair 760T case, Cooler Master M2 1500W PSU, and Corsair H110 CPU liquid cooler). Is that worth the extra peace of mind, OC speed, and a quieter build? Maybe, maybe not...

 

For the same price we could build a 4960X workstation (with 4 780s) and a 4930K render node (with 4 780s). So similar total CPU power, but with almost 18,500 CUDA cores in total. That seems to make more sense, but how much do we trust the network rendering? If anything doesn't work with it, I'm stuck with about the same CPU power as my 2010 Mac Pro and only 9,000 CUDA cores. With our current schedule, I don't have time to test anything before buying... hopefully the forums can be trusted :)
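For the record, the core math on that split (a GTX 780 has 2,304 CUDA cores):

    cores_per_780 = 2304
    workstation_cores = 4 * cores_per_780   # 4960X box: 9,216
    node_cores = 4 * cores_per_780          # 4930K box: 9,216
    print(workstation_cores + node_cores)   # 18,432, i.e. the "almost 18,500"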

 

I've also been trying to find a good i7 mobo that will handle the 4 GPUs and still have PCI-E slots to spare. Most seem to have only 4, and on the ones that have more, the extra slots get buried under the GPUs once they are installed. I really just need one more for a RAID-related card.

 

The ASUS Rampage IV Black, ASRock Extreme11, ASUS P9X79 WS, and Supermicro 7047GR-TRF seem to be the best boards so far. They each have their pros and cons.


 

I've also been trying to find a good i7 mobo that will handle the 4 GPUs and still have PCI-E slots to spare. Most seem to have only 4, and on the ones that have more, the extra slots get buried under the GPUs once they are installed. I really just need one more for a RAID-related card.

The ASUS Rampage IV Black, ASRock Extreme11, ASUS P9X79 WS, and Supermicro 7047GR-TRF seem to be the best boards so far. They each have their pros and cons.

 

I have the Rampage IVs in our primary workstations. Regarding your need for a RAID card: there is onboard RAID support, so if you are connecting drives via SATA to the board, you can RAID them without an extra card.

 

 

 

I have been spec'ing an i7 option as well, but I'm concerned that it may not have enough CPU power for AE rendering.

 

AE does a pretty horrible job at utilizing multiple cores anyway. More cores would help a bit, but not nearly as much as you probably think.

 

 

 

Product Subtotal: $12,171.50

 

Insanity. Build it yourself. It's not hard.

 

 

 

 

I don't know what you have lined up that needs all this GPU horsepower, but it had better pay for the investment fast. I have Titans and all, but damn... what you are describing is far out. 16 Titans!!??!!?? You realize premium hardware like this is on a 6-12 month cycle, right? Just sayin'... this ~$50k investment you are about to make will feel old in 2 years.

 

Have you looked into using Amazon GPU instances for this farm stuff? You could save tens of thousands without any of the worry of holding aging hardware.

 

Seriously... get some really smokin' workstations for about 3-6k each if you want all the bells and whistles, and keep a totally separate cluster of machines (I recommend the cloud solution) for your rendering. Costs are kept down, your rendering hardware is kept up to date, and rendering never ties up your workstations. Plus, since you would have to use Deadline to manage your Amazon renders, you would then also have the ability to do multi-machine AE renders via command line in the background across local workstations.


I would have to agree with AromaKat -- you can go down to Microcenter in Denver, buy the parts yourself, and make two awesome machines for that 12K sticker tag... or 1 awesome machine and like 9 render nodes. Unless you really have Octane locked down and running at full speed already, I personally think it's a waste of money. A good friend recently completed an amazing Octane animation with two cards in one computer. If you don't have a specific need for these GPUs (and in my mind that's really for large-scale simulation rather than rendering), you can get away with even 600-series 4GB GTX cards. You save money and then use that to get new machines in a year or two instead of 4-5 years.

 

I use a very similar i7 processor (just last gen, overclocked to 4.5GHz) and it can handle anything AE can throw at it. I have also found that the best way to work is to use Background Renderer Pro, set each render to 1 core at max, and background-render up to 4 or 5 comps at the same time while working; and that's all fairly heavy broadcast work.
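If you'd rather skip the GUI script, aerender can do the same trick from the command line. A rough Python sketch (the aerender path, project, and comp names are placeholders; adjust for your install):

    # Kick off several single-comp AE renders in parallel, one process each.
    import subprocess

    AERENDER = r"C:\Program Files\Adobe\Adobe After Effects CC\Support Files\aerender.exe"
    comps = ["Comp_A", "Comp_B", "Comp_C", "Comp_D"]   # hypothetical comp names

    procs = [subprocess.Popen([AERENDER,
                               "-project", r"D:\jobs\spot\spot.aep",
                               "-comp", name,
                               "-output", rf"D:\renders\{name}.mov"])
             for name in comps]
    for p in procs:
        p.wait()   # let all four background renders run to completion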

 

Swing in next week and take a look.

 

Also remember that adding more cards to one machine is, to my knowledge, less efficient for render speed than having the cards spaced out across different machines. Most programs are limited by the amount of VRAM on the cards, not the CUDA cores, so you may end up with 10 gajillion cores operating on 6 gigs of RAM.


Yeah, I think I read somewhere that adding a second card does not mean 2x speed; additional cards don't bring equivalent speed increases.

 

There was a thread a while back with AE speed tests. Not too many results in there, but higher GHz was faster than more cores. C4D playback also relies more on a single thread at higher GHz. The only situation that benefits from a ton of threads is a C4D render; AE can't really use them, since its multiprocessing is too randomly flaky and slow, and nobody knows when to use MP.

 

And about doing an Octane render and a CPU render at the same time: I wouldn't count on that. I've found the machine still gets pretty taxed by a GPU render.

 

Nerd on.


 

I have the Rampage IVs in our primary workstations. Regarding your need for a RAID card: there is onboard RAID support, so if you are connecting drives via SATA to the board, you can RAID them without an extra card.

 

Seriously... get some really smokin' workstations for about 3-6k each if you want all the bells and whistles, and keep a totally separate cluster of machines (I recommend the cloud solution) for your rendering. Costs are kept down, your rendering hardware is kept up to date, and rendering never ties up your workstations. Plus, since you would have to use Deadline to manage your Amazon renders, you would then also have the ability to do multi-machine AE renders via command line in the background across local workstations.

How have the Rampage IVs worked out for you guys? Have you had any problems with the built-in RAID controller?

 

The 4930Ks look great for the render nodes. The 4960Xs look great for the workstations (since they can be significantly and safely OCed).

 

Nice tip about running multiple AE renders on one core each... thanks, Ed!

 

Stuff like this is why it is worth the trouble to get 4 GTX cards + Octane running:

interactive speed:

https://www.youtube.com/watch?v=DiMdJjjlr-A

 

2 Titans = 5-15 min/frame render:

 

Octane's speed does scale linearly with the number of CUDA cores it gets. The biggest slowdown comes from having to transfer the scene/textures to the GTX cards, so having fewer computers with more GTX cards per computer will be faster. Other renderers vary in their limitations, but Octane seems to be the strongest one that works with C4D for now. We have a project coming that would keep our current CPU render farm busy for 6 months straight, so hopefully going GPU with Octane will dramatically cut that down while adding the ability to do true GI with better test-rendering capabilities.
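A crude model of why fewer, fatter boxes win (my framing of the upload-vs-cores trade-off, not Octane's documented numbers):

    # Per-frame time ~ fixed scene/texture upload + work split across cores.
    def frame_time_s(cards, upload_s=15.0, work_core_s=2.0e6, cores_per_card=2880):
        return upload_s + work_core_s / (cards * cores_per_card)

    print(frame_time_s(4))   # one 4-card box: pays the upload once, ~189s
    print(frame_time_s(2))   # a 2-card box: same upload, half the cores, ~362s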

How have the Rampage IVs worked out for you guys? Have you had any problems with the built-in RAID controller?

 

In general, no problems whatsoever. I did have an issue once, but it was my own stupid fault: I had a RAID 5 going at one point (never do RAID 5; they always screw me in the end) and replaced the BIOS battery during my yearly upgrades, thinking I was being proactive. When I did, the RAID 5 config, along with all my BIOS settings, was lost. It was such a dumb move on my part. Of course that happened. So... beware of that, if you hadn't figured so already.

If you are doing RAID 1 it should be fine. Back up EVERYTHING before you remove that BIOS battery: your HD data, your BIOS config, etc. It's a large watch-battery-looking thing on the motherboard. Also, label everything meticulously when building: hard drives, which SATA port each HD is plugged into, hardware IDs, and so on. Keep detailed logs of that stuff. If you know exactly the configuration the drives were in, there is software available to figure out the RAID you had and reconstruct the data after a loss like that.

It sounds scary, like it's a real problem, but the truth of the matter is that all RAID solutions have similar weaknesses. In general, though: no issues using onboard RAID, aside from it eating up your SATA ports, if that matters.

The 4930Ks look great for the render nodes. The 4960Xs look great for the workstations (since they can be significantly and safely OCed).

 

 

Again, you can save money by avoiding the 4960X. It's a bit of a splurge with a very nominal benefit; the 4930s can OC just as well. I have the previous gen (3930K) OCed to 4.7 as standard. Make sure you get some really good water cooling, though. I have the H100 on mine, with 2x fans through a radiator, and the air coming out is still hot.

 

Find someone on YouTube with the same mobo/CPU combo that you end up getting and copy their BIOS OC settings. It's what I did, and with just a couple of trial-and-error tweaks I got it stable.

 

The Rampage IV is built for OCing, with all of the settings you could want within the BIOS.

 

 

* Regarding the 4x GPUs: I understand now, and having local GPUs makes sense. Just... find good deals on 690s, 790s, etc. on eBay to start. Start with 2 per machine and see if you need another 2. The GPU world is the fastest-changing area in computers right now, and I'd hate to see you flush so much money down the toilet on this quarter's $1,000 fad card for a 5% performance increase over a card you can get off eBay for under $500.



Hey, just to update this thread: we got the first render node built and are waiting until early 2015 to do the workstations. This node is a beast! Thanks for all the help.

 

The latest Intels came out the week before we were going to order, so we ended up going with the 5930K processor and an EVGA Classified motherboard (purely because of availability... the ASUS X99-E WS board is what we really wanted). Just using the XMP profile on the RAM and the H100i cooler, I got a stable OC of 4.1GHz (without changing voltage). Cinebench 15 scores around 1200 on this, so it's almost at the 1350-ish of our 2010 Mac Pro. I know I could go higher, but I don't need a lot of CPU power now, and I want to save power for the 4 GTX Titan Blacks (ASUS brand). With the Corsair 730T case and a couple of Noctua fans, full load on all GPUs is about as loud as the external HD Pro RAID on our Mac Pro. Windows 8.1 seems good... I don't know what people are complaining about.

 

Of course what matters is how Octane performs... and it's amazing. It's not perfect yet, but it is workable and so fast. For the house interiors/exteriors and the occasional logo it is near perfect. More complex scenes/textures that won't fit in 6GB of VRAM still have to use another renderer.

 

The Titans were a bit overkill, but the project this is all mainly for lasts until May, and they seem to be the best option until then. The 780s looked good too, but none of the 6GB versions come with the reference cooler (that ACX cooler doesn't look like it would work well stacked 4 high, and I don't want a slower card getting temperature-throttled either).

 

As for the workstations to come, they will be this build but with the 5960X CPU, the ASUS X99-E WS board, and 64GB of RAM. The main advantage of that board is true x16 PCI-E lanes for all the Titans. The other advantage of the X99-E WS is that it accepts Xeon CPUs too, so if we go back to more CPU rendering, we can just drop in a single E5-2697 v3 or something. The workstations will also get some SSDs and an internal RAID 10.
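On the x16 point, the lane math is why that board carries PLX switch chips (as I understand its block diagram; worth double-checking):

    # Haswell-E exposes 40 PCIe 3.0 lanes from the CPU; 4 cards at x16 want 64.
    cpu_lanes = 40               # 5960X / 5930K
    wanted = 4 * 16              # 64 lanes requested
    print(wanted - cpu_lanes)    # 24-lane shortfall on a plain X99 board
    # The X99-E WS bridges the gap with PLX switches so all four slots
    # can run at electrical x16 instead of dropping to x8.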

 

The build went smoothly too... except for a bad PSU (the Corsair AX1500i). Once we figured out it was the PSU, and not the motherboard or RAM, everything went fine.


Thanks for the update. Sounds like an awesomely ridiculous setup.

 

Please post some Octane results if you can. My biggest fear with Octane is versatility, i.e. not being able to update something in the particular fashion the client requested, requiring us to render things in a different way and blowing deadlines and budget out of the water.


Gonna hijack this thread.

I'm looking at upgrading/moving my fileserver from a Node 304 case to a Node 804 case (slightly bigger, holds 2 more drives), so I'll have the skeleton of another system left over for an Octane renderer.

 

A - Does anyone know if the Node 304 case will fit 2 GPUs?

B - Is a dual-core Celeron enough power for an Octane render node?

C - Are 2x 750 Tis decent, or should I just go for one 780 Ti and hope to upgrade to a second later?

 

I'll have the Node 304 case, 1x 4GB DDR3 RAM, an Intel Celeron G1630 2.8GHz CPU, and 3 or 4 1TB drives.

Figured I could pick up a decent motherboard, a 120GB SSD, and then a GPU, and I'd be good.

 

Thoughts?



I'll have the Node 304 case, 1x 4GB DDR3 RAM, an Intel Celeron G1630 2.8GHz CPU, and 3 or 4 1TB drives.

Figured I could pick up a decent motherboard, a 120GB SSD, and then a GPU, and I'd be good.

 

Thoughts?

 

I assume you're planning on using the SSD as your boot drive? I'd get a bigger drive. I did the same thing with Windows 8.1, and it took something obscene like 50 or 60 gigs. I read all around about it saving backup data to the drive and how to disable that, but I could never find out where those files were saved, and no matter how many tutorials I followed, I still felt like I was lacking space. Disclaimer: I hadn't used Windows for 10 years up until earlier this year, so someone with more know-how, feel free to jump in.


How often do you boot up Windows? Do you really need it on an SSD?

 

I mean, once it's in your RAM, your SSD has zero purpose.

 

I'd use it for keeping footage or something that needs to be constantly updated or loaded :D

 

My computer boots up once a day, and then it all sits in RAM...

 

Maybe if you use something like 3ds Max, which takes a year to load, it'll load faster. I hear games benefit greatly from being on an SSD since it makes them load a lot faster. But I think in general, putting Windows on an SSD is a waste of time if your computer is on most of the day.

 

 

re: node 804 case: google image search: https://www.google.hu/search?q=Node804&client=firefox-a&hs=7Jc&rls=org.mozilla:en-US:official&channel=fflb&source=lnms&tbm=isch&sa=X&ei=t6RHVIedPPHW7QbbyYGYCA&ved=0CAgQ_AUoAQ&biw=1597&bih=875#facrc=_&imgdii=_&imgrc=g-pBa-GrOxcZPM%253A%3BOWV1rUtzM581JM%3Bhttp%253A%252F%252Fbenchmarkreviews.com%252Fwp-content%252Fuploads%252F2014%252F05%252FNode804_270Xs.jpg%3Bhttp%253A%252F%252Fbenchmarkreviews.com%252F15508%252Ffractal-design-node-804-micro-atx-case-review%252F%3B600%3B485


How often do you boot up Windows? Do you really need it on an SSD?

 

I mean, once it's in your RAM, your SSD has zero purpose.

 

 

 

Having an SSD as a boot drive has numerous advantages. It's not only booting up that's faster; every application will start faster too, and your system will feel snappier overall. Don't forget all the unnoticeable temporary caching happening in the background, which an SSD also speeds up. And last but not least: no spinning noise :)



Yeah, I see no reason NOT to use an SSD. All my systems (Mac and PC) are on SSD OS drives now, and they all feel snappier.

It's just for a render node, so it's not really going to have much on it; 120GB should be more than enough.

 

@vozzz: that's the Node 804, not the 304. From what I've found, it seems the 304 won't work with 2 GPUs, so this whole thing is kinda moot :D


I'm curious what graphics cards most of you are running on Mac & PC? There's been a big thread on the Octane forums about how the newest NVIDIA cards are not performing any better than the previous generation.

I have an aging Mac Pro with a solitary GTX 680, and a Windows 8 rig with an EVGA GeForce GTX 760.

Ideally I would like to add one more card to each machine. Any recommendations that are not $1,200 Titan cards?


The 680 and the 760 are still good cards. I'd just double up on them if I were you.

 

I upgraded from a 680 to a Titan and only saw a 12% speed increase in TFD GPU sims. The increase in VRAM was worth it for me at the time, because there were no other 6GB offerings.


(For context, I'm on a 2009 Mac Pro, 2.93GHz 8-core, with 32 gigs of RAM.)

 

I'm running a GTX 780 and I am pretty pleased with it. I don't do any of the crazy shit you guys are on about with Octane, but Resolve, Media Encoder, and Premiere all run great, and I did notice a big difference in AE on the ONE project where I have ever used the ray-traced renderer.

 

Also, speaking to the SSD thing: I picked up a couple of pieces of kit that might be of interest.

 

1: http://www.caldigit.com/Fasta-6GU3pro/

Doesn't require extra power like other USB 3 cards, and has 2 internal SATA III ports, so you can hook up your SSD in the optical bay, route the cables, and voila. I'm getting something like 450MB/s read/write on a Samsung 840 Pro. It's not maxing out the drive, but it's 2x as fast as SATA II. My Windows install lives on this drive.

 

2: http://eshop.macsales.com/shop/SSD/PCIe/OWC/Mercury_Accelsior/RAID

My OS X install lives here, and this thing is rad: over 600MB/s read/write. A little pricey, but man, does it make everything feel faster.


Thanks AK, I was thinking of doing just that, seeing as the new generation of cards is apparently no better for Octane. Good to know about the +12% with the Titan.

