Dynamic Memory Allocation with Hyper-V 2012 R2
-
Not something that I have come across as an issue.
-
I'm starting to wonder if what I've stumbled upon is actually a limitation of the CAD software. Will keep this post updated with my findings.
-
Have you tried disabling dynamic memory and setting a static 16GB just for testing purposes?
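Something like this from PowerShell on the host would do it. Just a sketch; the VM name "CADVM" is a placeholder, and the VM has to be powered off before you can change its memory settings:
```
# Placeholder VM name; memory settings can only be changed while the VM is off.
Stop-VM -Name "CADVM"

# Turn off dynamic memory and pin the VM at a static 16GB.
Set-VMMemory -VMName "CADVM" -DynamicMemoryEnabled $false -StartupBytes 16GB

Start-VM -Name "CADVM"
```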
-
@IRJ said:
Have you tried disabling dynamic memory and setting a static 16GB just for testing purposes?
I'm going to after the test I'm running now finishes. It's interesting that the app crashes; however, if I just close the window, the translation resumes and creates CAD data like it's supposed to.
-
@bill-kindle Are you using NUMA?
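If you want to check, here's a quick sketch using the Hyper-V module on the host (output varies by hardware):
```
# List the host's NUMA nodes with their processors and memory.
Get-VMHostNumaNode

# See whether the host is allowed to span NUMA nodes when assigning VM memory.
Get-VMHost | Select-Object NumaSpanningEnabled
```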
-
@IRJ said:
@bill-kindle Are you using NUMA?
I haven't gotten to that level of nerd yet. For a minute I thought you meant the Numa Numa guy.
-
@Bill-Kindle said:
@IRJ said:
@bill-kindle Are you using NUMA?
I haven't gotten to that level of nerd yet. For a minute I thought you meant the Numa Numa guy.
I went through that in my 70-410 training.
-
@Bill-Kindle said:
@IRJ said:
@bill-kindle Are you using NUMA?
I haven't gotten to that level of nerd yet. For a minute I thought you meant the Numa Numa guy.
Glad I wasn't the only one...
[embedded YouTube video]
-
Maybe I missed it, but when does the CAD software crash?
-
A.J., I think that's how you got famous
-
@Bill-Kindle said:
A.J., I think that's how you got famous
I'm not sure how to take that...Why you!! Thank you!!
-
@ajstringham said:
@Bill-Kindle said:
A.J., I think that's how you got famous
I'm not sure how to take that...Why you!! Thank you!!
You are the Numa Numa guy! lmao /jk
-
@Bill-Kindle said:
@ajstringham said:
@Bill-Kindle said:
A.J., I think that's how you got famous
I'm not sure how to take that...Why you!! Thank you!!
You are the Numa Numa guy! lmao /jk
I'm going to take that with a grain of salt and say thanks!
-
@Bill-Kindle said:
I'm working with local storage which I already know is going to be part of my bottleneck
Not part of your question, but I was wondering: why are you calling this part of your bottleneck? Because you only have RAID 1? You could remove this bottleneck for around $150 with an SSD, assuming this is primarily for testing. If it's for production, well, that's another story.
-
@Dashrender While true, it's both a funding problem and a limitation of the hardware I have to work with.
That said, I've now ruled out dynamic memory as the problem: the data translation uses all 16GB of RAM while the job runs, and as soon as it's done, the allocation drops substantially without the guest OS even knowing the difference.
Also, a second translation job, this time using an iSCSI LUN I set up on my new Synology NAS (just a small 10GB LUN to store and write data to as part of this test), is running much faster than with local storage. Granted, the host machine has consumer-level drives while the Synology has WD Reds, but it's a very noticeable difference.
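For anyone curious how I watched the allocation drop, something along these lines from the host works. A sketch only; the VM name is a placeholder:
```
# Poll assigned vs. demanded memory for the VM every 5 seconds.
while ($true) {
    Get-VM -Name "CADVM" |
        Select-Object Name,
            @{ n = 'AssignedGB'; e = { [math]::Round($_.MemoryAssigned / 1GB, 2) } },
            @{ n = 'DemandGB';   e = { [math]::Round($_.MemoryDemand / 1GB, 2) } }
    Start-Sleep -Seconds 5
}
```
And connecting the host to the Synology LUN was roughly this, using the built-in iSCSI initiator cmdlets (the portal address is a placeholder for the NAS IP):
```
# Register the Synology as a target portal, then connect to its target(s).
New-IscsiTargetPortal -TargetPortalAddress "192.168.1.50"
Get-IscsiTarget | Connect-IscsiTarget
```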