Daz is promoting Hexagon/Bug fixes
RFB532
Posts: 94
That's great Daz is promoting Hexagon, but it is desperately in need of bug fixes and/or an update. I stopped using it about three years ago because it would crash constantly. So I figured I would give it another chance and installed the latest version on my high-spec Windows 10, 64-bit computer. I loaded it up, did something simple and made some text (the word "Hello"), moved the tessellation slider a few notches, then...BAM!!!!! IT CRASHED! I would really like to have this as a reliable and stable option in my workflow, since it is integrated with Daz so well. It would be great if it had an update soon. Thanks
Comments
Yes, it seems strange that they are pushing Hex all of a sudden?
I don't think they are pushing it. They are giving it away for free. I use the modeler Silo, which is 64-bit and stable. Blender's internal modeler is also free and a more stable modeler than Hexagon. I would recommend new users try Blender instead of Hexagon and look at sicklyeld's tutorials for moving things back and forth to Daz Studio.
Blender's UI sucks.
I agree, the Blender UI is the Anti-Christ. I've long hoped that DAZ would either support Carrara or Hexagon again or give DAZ viable modeling functionality. Perhaps there will be a revitalization of Hexagon at least....: )
See here: https://www.daz3d.com/forums/discussion/comment/3056261/#Comment_3056261
That's why I've never learned Blender. I've tried it about 3 or 4 times and it's always the same thing. I open it and start following a tutorial or Youtube video. Then, I accidentally click on some part of the UI and half of my controls disappear. Since I don't know anything about Blender, I don't know what I clicked wrong or what I need to do to fix it and I'm totally lost. I can uninstall and reinstall it, but it remembers UI settings (back then I didn't know how to wipe it from the registry) so I'm still lost.
Blender (and a lot of other freeware) is made for professional-level users, but that doesn't help hobbyists trying to learn. It needs an 'idiot mode' that keeps you from messing with the advanced stuff or warns you before you're about to close half of the UI. It's a great program, but it has a steep learning cliff.
I'd love to see a new Hexagon built from the ground up by Daz.
The main reason why I still use Hexagon 1.21, after 9+ years, for modelling 90% of my objects is its UI. I had issues with the 2.x UI, when it came to UV mapping. I hope the new Hexagon release has fixed all that UV pinning nonsense from the past. I would like to do my UV mapping inside of Hexagon, since I model in it and I can create stuff very quickly because of its UI.
modo is the same way. Click on the wrong thing and *poof-be-gone* is what's left of your UI. Simple modeling does not have to be that over-engineered with such apps.
Blender's UI is completely customizable. If it sucks, then change it!
Well, it's about damned time!
I am pleased beyond belief at the announcement that Hexagon is going to be updated with bug fixes and made 64 bit compliant.
My suggestions:
Blender has some impressive capabilities, but the interface is so tedious, buggy, and non-intuitive it's not worth the effort. Everything relies on a key command, and if you accidentally push the Option key instead of Shift, it blows up everything. You're better off just biting the bullet and buying MAX or Lightwave. I love Hexagon, but I barely use it anymore because it hasn't been updated in so long I wonder if it's even Y2K compliant.......Let us hope the Hexagon Phoenix rises from the ashes....: )
Well, I don't want to distract the main topic of this thread, but I felt I should set the record straight. MAX is $700 and Lightwave is a $1,500 annual subscription...that is, payable EVERY year. I'm sorry, but that's not just "biting the bullet". If I were to coin a phrase, it would be that those options are more like "costing an arm and a leg".
To bring this back on-topic, even though I'm a Blender proponent, I will say that "free" is not a requirement for me. I'm willing to pay a modest amount for Hexagon because Hex has a very visual, accessible, and immediate kind of workflow experience. I'm just not willing to pay to squint all day at a tiny UI and I'm not willing to pay for constantly having to restart the app after it crashes.
But DAZ should not treat this lightly or start thinking that people like me will just jump on it without looking first. I will only pay for it if the UI is easy to read and configurable, it is 64 Bit-capable, and has a REASONABLE update schedule. No more squinting at my big screens and no more of this abandoning it for years and years without a peep from DAZ.
Now if only they'd listen to me about Bryce.
I just checked and it appears that the last time it was updated was 2006, so it's quite likely Y2K compliant.
In actual fact, Daz didn't acquire Hexagon till around 2005/2006, so that flippant remark about Y2K compliancy was really rather unnecessary, as was the whole panic-stricken Y2K mania floating around at the end of the last millennium. Talk about a storm in a teacup. Well, actually, we couldn't talk about that storm in a teacup on the Daz forums, as they didn't exist then.
The trick is learning its UI in order to change it. That's the beauty of Hexagon's UI. You just use it from the start without it being in your way.
Hex 2.5 was released in 2012.
Folks, I was joking about the Y2K thing but Hex is still a product of a bygone age that needs an update.
Well, Y2K is not anything to joke about. It certainly had the potential to cause major problems, particularly in database logging, accounting, and real-time or embedded operating systems. Hardware was already compatible, and most OS providers had already gotten their act in order (or no longer used "dates" for logging). But application code still needed a review and, in some cases, a significant rework.
There is an excellent Wikipedia article on Y2K. If that was before your career, I highly recommend having a look. If you doubt the potential impact, I suggest you scroll down in that article and read the first case study on how serious the problem was (the Sheffield Down syndrome incorrect risk calculations). That one actually resulted in deaths.
From 1997 until about the middle of 2001, that was one of my major job functions: helping to prepare for Y2K. Yes, the effort continued well past January 1st, 2000. Mainframe computer programs written from the 1960s and not updated much into the mid-80s (and even less in the 90s) sometimes used only 2 digits for the year. In some cases, they used only a single digit, or some permutation of the 8 bits within a single byte, which can still store only one of 256 combinations, none of which would represent a 4-digit year without some assumptions being made.
The issues that came up would make basic date mathematics give the wrong answers after all 4 digits changed. Or 3 months after they changed. Or a year later. Sometimes they caused the application to crash (but rarely the OS). Other times they were more evil, because the bug would not cause a crash, but would instead give incorrect information to the user, potentially resulting in the wrong decision being made.
Imagine calculating the terms of a 36-month loan when the application mistakenly thinks you've entered a "negative 98 year loan". Or a 28-year-old pregnant mother-to-be now being calculated as a 72-year-old (zero minus 72 and truncate the sign), who, if pregnant, would have a whole host of DIFFERENT risk factors than a 28-year-old.
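To show what I mean, here's a toy sketch in Python of that two-digit age bug (the real code was in mainframe languages, and `age_from_two_digit_years` is just an illustrative name I made up, not anything from an actual system):

```python
def age_from_two_digit_years(birth_yy, current_yy):
    """Compute an age the way some legacy code did: subtract two-digit
    years and throw away the sign ("zero minus 72 and truncate the sign")."""
    return abs(current_yy - birth_yy)

# Before the rollover this looks fine: born '72, current year '99.
print(age_from_two_digit_years(72, 99))  # 27 - plausible

# After the rollover: born '72, current year '00.
# The correct answer is 28 (2000 - 1972), but the code gets 72.
print(age_from_two_digit_years(72, 0))   # 72 - silently wrong, no crash
```

Notice that nothing crashes here; the program happily hands a wildly wrong number downstream, which is exactly what made these bugs so evil.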
In some cases, date emulation tools could be used to find and fix these issues, but it still required human beings with specific skillsets to make the corrections to the source code, recompile the programs, and retest the logic, and eventually put those changes into production.
Many companies treated this as a very serious thing, and so they spent a lot of time, labor, and money looking for, then fixing application inadequacies in code. That work needed to be done, and in my opinion, that's the main reason that application problems were few.
Even the US Government had acknowledged that Y2K could be a serious problem, so the government created the "Year 2000 Information and Readiness Disclosure Act". If a company wanted to reduce its legal liability and lower its risk of being sued by shareholders, employees, customers, and vendors, then it HAD to be engaged in the Y2K effort.
I highly recommend that Wikipedia article as a starting point for reading. Given the sheer amount of legacy source code in the world back then, Y2K is, in my technical opinion, more of a success story than anything else.
It does need an update, but I wouldn't call it from a bygone age. In fact, the modeling and UV features are still top of the line. This model took me a day to model and map with the current Hex program:
When I first started modeling for game mods such as Need for Speed and Delta Force, I had to do it with hex editing and would take months to do this same model (seriously, that computer would crash with something this big).
"Current", as in the version we all have? Or the private beta?
Y2K was a joke anyway. A lot of hype about nothing. The media was fake news even back then.
It's great to have hard-surface modeling and organic modeling in the same app, as well as the CAD-like units system.
The current public release. I would never make a commercial item using a beta release. It would only take one broken part of the software to ruin your day ;)
In today's age, I rarely have applications, even complex ones, crash. And certainly not ten times or more in a few hours. That used to be acceptable, but is no longer so. The tolerance for crashy-crashy apps is not what it used to be, because there are MANY software products out there that don't crash. I am eagerly awaiting 64 Bit and bug fixes. Both are critical for me to start using the program again.
I don't know what your history was, but I suspect that Y2K may have happened before your career, or your career never dealt with the technologies and application code that was most susceptible to errors. I assure you, Y2K was not a joke, and it was not fake news. It was a legitimate and real risk. As I said in my prior post, there was some fallout and some of it was very serious. It could have been worse but was not. From personal experience, I know this was due to the high priority it was given.
I guess having doubters means that those of us who put our hearts and souls into the work just kind of made it look easy.
What a lot of people don't realize is that the programmers back in the late 1960s and early 1970s who wrote some of those programs never expected them to still be in use twenty to thirty years later; that they were is a testament in and of itself to how efficiently they did the job required.
I was there. I wrote a lot of code that would fail in 2000. Some of it would fail even sooner depending on BIOS/computer, etc. The idea was that no one would still be using the code/computers that late in the game. When people are told that their dates won't work after 1999, they buy a new computer and update their software if they need to. It's not rocket science. And no one of any importance was still using junker computers when 2000 happened. So it was just hype.
I see it now. Your use of the words "BIOS" and "buy a new computer and update their software" makes it obvious that your perspective is entirely PC/end-user based, and you have no experience with the other types of systems more widely in use back then; namely the midrange and mainframe computers of the time and how their applications fit into the architecture, as well as the fact that companies running the application software were often also the application developers.
Today, software vendors are what most people think of as "devs", and very little is written in-house. But back then, with those types of systems, "buy a new computer" was not the answer. And "update the software" meant that the company's own in-house devs would be tasked with going through the code, sometimes line by line. Millions of lines of code, within thousands (or tens of thousands) of application programs, written in COBOL, assembler, Fortran, PL/1, or other languages. This required skill and expertise specific to the language used for a given application. A good COBOL programmer might not make a good assembler programmer.
IT has come a long way in 17 years, but there are still many of those old systems around, even now. If you book a flight somewhere, it's very possible that your app is accessing a mainframe computer. You just don't know it because your PC's web browser or your tablet's app is serving you as the front-end, hiding the true technology from you. There really is a lot more to the story than you are aware.
Backwards... kinda.
I love modeling in Carrara, but I'm not what I would call a modeler. I've messed around with Hexagon and its wonderful bridge from and to DS and am very excited to hear that Daz3d is going to pour some attention into it. Good move for them, in my opinion.
Like Silo, Hexagon is a modeler. Period. That has a certain elegance about it when modeling is what needs to be done.
I'll always (Always) stand by Carrara, which allows me to build vast, expansive scenes filled with anything and everything - and it's all animated, ready for the six cameras to film (render) the scene as I sleep using its awesome Batch Queue render - but wait... I need (insert any sort of needed thing)... no worries. I just quickly and easily model something in any manner of different ways - including polygonal and spline modeling - assign whatever material domains I need, create the shaders... Rock On... good to go!
Hexagon having that bridge from DS and back again, or however one ends up using it, is a magical thing to have. I'm sure that once it's upgraded to 64 bit and fixed up a bit, it will be able to stand on its own with or without DS, no problem. Grow out of that and perhaps give Silo a try.
Anyway, I'm glad they're doing it. Perhaps it will give them some extra incentive to also work on Bryce and Carrara as well - eventually.