
  1. Hey guys, we definitely explored this concept, and here are some things we considered. Element 3D uses OpenGL for rendering, so it gets its speed from rendering the entire frame at once on a single GPU. CPU ray-tracers and CUDA renderers like Octane, on the other hand, render individual pixels or rays, so those points can be rendered anywhere, at any speed, and then combined for the final image. For an individual artist using Element 3D, having a powerful GPU is ideal, though I understand that once you crank the settings super high it can slow down. With that said, network rendering is possible with Element 3D, but it requires a render farm with adequate GPUs, and since most render farms don't have great GPUs, we decided to focus our resources on local GPU rendering performance. It's also ideal that all render farm GPUs come from the same maker. If you want to try it out, I'm sure we could arrange some discounted licenses to be used on a farm... But if I'm completely honest, I think that once you get to the point where you NEED a render farm for Element 3D, you start crossing that software threshold where your project might benefit from using C4D, which can be sent to a render farm. I believe in Element 3D 100%, and I think that render speed is a huge advantage for a lot of projects, not to mention using it for look development with instant feedback, but I also use 3ds Max and V-Ray for projects when I need to. Best!
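To illustrate the distinction above, here is a minimal, hypothetical sketch (not Element's or Octane's actual code; `shade` is a stand-in for tracing one ray): a per-pixel renderer can split a frame into independent tiles, render each tile anywhere, and stitch the results back together, while a whole-frame OpenGL render has no such natural split.

```python
# Hypothetical sketch: why per-pixel renderers distribute easily.
# Each tile depends only on the scene, not on other tiles, so tiles
# could be rendered on different machines and stitched together.

def shade(x, y):
    # Stand-in for tracing one ray; real renderers do far more work.
    return (x + y) % 256

def render_tile(x0, y0, w, h):
    # Render one rectangular tile of the frame independently.
    return [[shade(x, y) for x in range(x0, x0 + w)] for y in range(y0, y0 + h)]

def render_frame(width, height, tile=8):
    # Tiles could be farmed out to different machines; here we just loop.
    frame = [[0] * width for _ in range(height)]
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            t = render_tile(tx, ty, min(tile, width - tx), min(tile, height - ty))
            for dy, row in enumerate(t):
                frame[ty + dy][tx:tx + len(row)] = row
    return frame
```

The tiled result is identical to shading the whole frame in one pass; the difference is purely where each tile can be computed.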
  2. Just wanted to let you guys know, we just released Element 3D V1.6 with several new features for better compositing and integration. The upgrade is free! Here is the link to the New Features notes. Watch New Feature Tutorial
  3. Hey! We just released a customer beta for Element 3D V1.5.5 that has some new features and bug fixes. As posted above, these are new features in addition to those in Element V1.5 seen here: http://www.videocopilot.net/help/element/tutorial/basic/element_1_5_new_features/ A couple of highlights for V1.5.5:
- File Re-linking system
- Find 3D Position Utility: Create a Null at any point of a 3D model.
- Generate Group Null Objects.
The download is here along with full feature notes and bug fixes: http://www.videocopilot.net/forum/viewtopic.php?f=42&t=105378
  4. Aaron, what graphics card do you have? I'll check out the motion blur crash too; it could be a bug. We are releasing a pretty big maintenance update next week, and I'd love to include any of these fixes as well.
  5. Binky, we also added a new polygon resolution called "Extreme" that doubles the resolution of Ultra. This should fix that problem for ya! And just to clarify, anti-aliasing has more to do with image smoothing; your example is more about polygon resolution. But we improved both!
  6. Yes, Aaron has it right: you just need to duplicate the layer for the additional comp/camera. This actually works very well; some of our beta testers used it with no problems. As for rendering 2K and 4K comps, it all depends on your graphics card memory. I can do 4K on my GeForce 580 with 2GB, but it's slow. 2K anamorphic is decent, so for film work and FX there's not much of a problem. Also, big news: we are getting ready to release a new update that offers great anti-aliasing options without additional memory usage. It does a special type of super-sampling that you can combine with multi-sampling (hardware anti-aliasing) to create crisp edges and highlights with a reasonable hit to render speed. Plus you can just turn it on at render time. There are 2X, 4X, and 8X options available; use as much as you need. Here is an example: the middle image is the standard quality in an extreme case, that is, really small reflective polygons. http://t.co/dHYLVOp9 The update drops next week with a bunch of bug fixes and tweaks.
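For anyone curious how super-sampling works in principle, here is a minimal sketch (illustrative only; the `shade` callback, the factor values, and the plain box filter are assumptions, not Element's implementation): shade the image at N times the target resolution, then average each NxN block down to one output pixel.

```python
# Rough sketch of super-sampling anti-aliasing (SSAA): shade at
# factor-times the target resolution, then box-filter down. The names
# and the simple averaging filter are illustrative assumptions.

def supersample(shade, width, height, factor=2):
    # Shade a high-resolution grid, sampling sub-pixel positions.
    hi = [[shade(x / factor, y / factor)
           for x in range(width * factor)]
          for y in range(height * factor)]
    # Box-filter downsample: average each factor-by-factor block.
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            block = [hi[y * factor + dy][x * factor + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Edges that land between pixels come out as blended values instead of hard stair-steps, which is the "crisp edges without extra memory at full resolution" trade-off described above.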
  7. Yo! Thanks for the reply. Yes, you should be able to blend two shaders, but we have found a bug on some ATI cards with the animation engine that we are working to resolve by next week. We're working really hard to fix compatibility issues and will have an update soon. The plug-in will also show you a little link in the UI when there is a new update.
  8. Hey Chris, the multi-object system was designed just for this. For a title, you would need to separate the characters into individual objects first and then export them as one mesh. Our text extrusion system does this automatically, so you can use the scatter and displace controls to animate each piece procedurally. There are also random seed controls, etc. Think of it like the watch example: it just allows you to animate them in or out. Here is an explanation of the two things Element converts: Materials: Any surface with a different material will create a material channel in Element. This is useful for being able to "hide" parts of a single model, or when you need to apply variable textures to an object like metal or glass and the shader controls require major differences. Multi-Object: Any object that is a separate mesh in the same OBJ (or C4D) file. Use multi-objects any time you want to animate groups of objects at once. These two functions are not dependent on each other but can be used together. The multi-object system does not recognize objects based on separate material channels.
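As a rough sketch of the distinction, here is how the two things show up in the OBJ text format itself (simplified parsing for illustration; this is not how Element actually reads files): `o`/`g` statements start separate meshes (multi-objects), while `usemtl` statements reference materials (material channels).

```python
# Count the two kinds of "channels" in an OBJ file's text: separate
# meshes (o/g statements) vs. distinct materials (usemtl statements).
# Deliberately simplified; real OBJ parsing handles more cases.

def count_channels(obj_text):
    objects = 0
    materials = set()
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] in ("o", "g"):
            objects += 1          # a new separate mesh begins
        elif parts[0] == "usemtl":
            materials.add(parts[1])  # a material channel is referenced
    return objects, len(materials)
```

A file with two meshes that share one material would report two multi-objects but only one material channel, which matches the note above that the two systems are independent.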
  9. Right. By default, Element normalizes model sizes to keep objects at a relative size when used in the particle array. However, you can disable the normalization and, with a little work, find the multiplier, so at least you could reuse the scale and just find the position from a Null. And if you keep your measurements the same, this scale value can be reused. I'm happy to help find the conversion after we launch too.
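Here is a hypothetical sketch of how size normalization like this typically works (illustrative only, not Element's actual internals): scale the model so its largest bounding-box dimension becomes 1, keep that multiplier, and use it to map a normalized scale value back to original units.

```python
# Illustrative sketch of size normalization (not Element's internals).
# A model is scaled so its largest bounding-box dimension is 1; the
# multiplier lets you convert a scale value back to original units.

def normalization_multiplier(bbox_min, bbox_max):
    # Largest extent of the axis-aligned bounding box.
    largest = max(hi - lo for lo, hi in zip(bbox_min, bbox_max))
    return 1.0 / largest

def denormalized_scale(scale, multiplier):
    # Recover the original-unit scale from a normalized scale value.
    return scale / multiplier
```

If your measurements stay consistent across models, the same multiplier can be reused, which is the point made above.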
  10. Yes, you can use multiple instances of Element, and the texture resources will actually be shared. The drawback is that the rendered frames have to be transferred from the GPU back to After Effects for each instance. This process is actually slower than the rendering itself, but in order to access the render we need to send it back to After Effects. When you play a video game, the GPU outputs the frames directly to your monitor, so the graphics never go back to the CPU; that way the latency is nearly instantaneous. But to make the image available to After Effects we have to transfer it back, so multiple instances of Element will seem slower than rendering everything inside a single instance, where the transfer only takes place once. Element has 5 individual groups which are designed to be used for separate objects, to avoid multiple instances when possible. Mylenium makes a great point about finding the sweet spot; once you pass that threshold of speed vs. function, decide on the best approach to take. Best!
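As a back-of-the-envelope model of the transfer cost described above (all numbers made up for illustration): each instance pays one GPU-to-CPU readback per frame, so splitting the same total render work across several instances multiplies the transfer time.

```python
# Toy cost model of GPU readback overhead (numbers are made up).
# Each Element instance renders its share and then transfers one
# frame back to After Effects.

def total_time(instances, render_per_instance, transfer_per_frame):
    # Every instance pays both its render share and a full transfer.
    return instances * (render_per_instance + transfer_per_frame)

# One instance doing all the work vs. three instances splitting it:
one_instance = total_time(1, render_per_instance=0.5, transfer_per_frame=1.0)
three_instances = total_time(3, render_per_instance=0.5 / 3, transfer_per_frame=1.0)
```

The render work is identical in both cases; the extra transfers are what make the multi-instance setup feel slower, which is why the 5 groups inside a single instance are preferable when they fit.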
  11. Thanks for checking this out, guys. Here are some additional follow-up notes.
Polygon Limit: Element doesn't have a polygon limit, but your GPU dictates how much you can do, which is pretty decent. The liquid models in the demo are around 100k polygons each, and they don't slow things down much. This motorcycle renders in .5 seconds at full 1080p, and that is with Ambient Occlusion on. View Image
User Interface: Yes, the UI looks a bit crammed at my tutorial resolution, BUT it is fully customizable and works at full screen. Here is a screenshot at 1920x1080; you can see all the parameters pretty clearly. http://goo.gl/i8ecf You can also move the Model Browser next to the materials and make the preview really big too.
Render Farm: We don't recommend using it on a render farm because GPU variations could cause inconsistencies, and most farms don't have GPUs anyway. But if you need to send to a farm, we suggest pre-rendering the Element layers and then sending the project off.
Things not supported in V1: Ah-ha!!
- Only static 3D objects can be imported, but we have several interesting animation capabilities that I will discuss soon.
- Lights do not cast shadows, but we do have a fast Ambient Occlusion system for object shading. We even have a matte shadow material type for doing object occlusion. Example Here.
Thanks again, I'll try to answer any other questions when I can, but right now I have to make about a million tutorials for the product page and help section!
  12. Yo, Apologies if this is not the right forum to post this. We're currently looking for a few additional beta testers to help with an upcoming AE Plug-in called Element. Specifically people who use Cinema 4D in their creative pipeline and are interested in some unique integration. We want to make sure that we get input from actual C4D users to help improve the workflow. If interested, please email system specs including graphics card to beta at videocopilot.net Thanks!
  13. Well, after doing a lot of research, I'd like to share some thoughts... The Quadros are mainly used for machine certification on specialized systems and supposedly offer some peace of mind... to some. But after spending way too much time and money on Quadro cards, I would say skip the Quadros. They are much slower (especially when comparing price), and although they are supposedly "tested" more, it's questionable whether that actually helps. I find there to be more problems with these cards than with regular, more popular gaming cards, which are updated more frequently since bugs and problems are reported much quicker by many more people. Some people have a specific need or requirement for the Quadro cards, but if you don't need one or are not sure, stick with GeForce and high-end ATI cards. This benchmark is not a gold standard but should help illustrate the comparative performance. http://www.videocardbenchmark.net/high_end_gpus.html BTW, if you are running a Quadro card on a Mac, Nvidia just released an update that fixes various issues; you may want to get it. http://www.nvidia.com/object/quadro-macosx-256.02.25f01-driver.html
  14. Hi everyone at Mograph, we are looking for a few more beta testers with professional AE experience for our upcoming plug-in, Optical Flares for After Effects. http://www.videocopilot.net/blogstuff/lens_lg.jpg (re-built the AE lens flare for the good times) It's a plug-in for creating and editing lens flares with very detailed controls, including dynamic triggering responses to simulate real lens flares. If you're interested in testing, please send me an email at andrew (at) videocopilot.net and include your system specs if you can; we're interested in a limited group of people who can really dig in. Our main goals are performance and your input on features and experience, so if you have some time to test it out, let me know. Thanks! Andrew Kramer VC Short Demo: I don't have a proper demo online, but the last 5 minutes of our video blog has a short walk-through of a couple of key features. http://www.videocopilot.net/theblogshow/blog-show-optical-flares-bears-oh-my/ PS, I hope it is alright to post here, I didn't post this inquiry anywhere else...
  15. Inside CrazyBump there is a hammer icon at the bottom right; click on it and you can batch process an image sequence.