Saving partial path length should be disabled once the tissue type number exceeds a limit #54

Open
fangq opened this issue May 28, 2020 · 0 comments


fangq commented May 28, 2020

The thread below reported a bug that crashes mmc when continuously varying optical properties (i.e. tissue type # = element #) are used. The source of the issue is the excessive memory allocation performed when the partial path length saving feature is enabled.

https://groups.google.com/forum/#!topic/mmc-users/IfnJBIAR4Dg
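
For reference, below is a minimal mmclab-side sketch of the behavior proposed in the title, i.e. skip the partial-path output once the tissue-type count exceeds a limit. The threshold value and the helper itself are illustrative assumptions; the cfg field names (elemprop, issavedet) follow mmclab conventions but should be double-checked against the release in use.

```matlab
function cfg = disable_ppath_if_too_many_types(cfg, maxtypes)
% Hypothetical helper (not an existing mmc function): turn off detected-photon
% / partial-path saving when the mesh defines more tissue types than maxtypes.
% cfg is an mmclab input struct; field names follow mmclab conventions
% (elemprop = per-element tissue label, issavedet = save detected photons).
    ntypes = numel(unique(cfg.elemprop));
    if ntypes > maxtypes
        warning('mmc:ppath', ...
            '%d tissue types exceed %d; disabling partial path saving', ...
            ntypes, maxtypes);
        cfg.issavedet = 0;   % avoid allocating an ntypes-wide ppath buffer
    end
end
```

A call like `cfg = disable_ppath_if_too_many_types(cfg, 4000); flux = mmclab(cfg);` would then run the simulation without producing the oversized detps.ppath output.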

-------- Forwarded Message --------
Subject: Re: [mmc-users] Large Geometry with a fine resolution: > 500 000 nodes
Date: Wed, 27 May 2020 08:36:32 -0700 (PDT)
From: Franz Steiermark [email protected]
Reply-To: [email protected]
To: mmc-users [email protected]

Hello Qianqian,

Thank you very much for the quick answer. I will install mcx and try it again.

Unfortunately, I have to assign varying optical properties to each element. I wrote a small test code which recreates the issue and attached it to this email.

The code as sent runs without errors on my computer; the three locations that cause a fatal crash of MATLAB are marked with arrows.

Franz

On Wednesday, May 27, 2020 at 16:08:37 UTC+2, q.fang wrote:

On 5/27/20 5:29 AM, Franz Steiermark wrote:
Hi,

I created a mesh with over 500,000 nodes and over 800,000 tetrahedrons, where each tetrahedron has its own unique set of optical parameters.

Depending on the computer I use, MATLAB crashes if I increase the number of photons.

On my laptop (8 GB RAM, i7-5500U @ 2.4 GHz), MATLAB crashes above 90 photons.
On my workstation (64 GB RAM, i7-8700 @ 3.2 GHz, AMD Radeon R9 200/HD 7900), MATLAB crashes above 400 photons.

The problem seems to be the large matrix created when using a large detector, as the detps.ppath variable has a size of 90 × 800,000 for 90 photons.
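
For scale, a quick back-of-the-envelope check of that ppath size (assuming a dense double-precision matrix on the MATLAB side; the crash itself comes from the allocation inside mmc, as described at the top of this issue):

```matlab
% Size of the reported detps.ppath output, assuming a dense double matrix
% (an assumption about the MATLAB-side result, not mmc's internal buffers).
ndetected = 90;                     % detected photons reported above
ntypes    = 800000;                 % one tissue type per tetrahedron
fprintf('detps.ppath ~ %.2f GB\n', ndetected*ntypes*8/2^30);   % ~0.54 GB
% The footprint grows linearly with the detected photon count, so it quickly
% exhausts memory when the tissue type count equals the element count.
```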

Is there a way to increase the number of photons and still account for absorption in the model?

Hi Franz,

Generally speaking, the total number of nodes in the mesh is less relevant to speed than the density of the nodes in the region right next to the source. This is because photon packets are far more likely to traverse the region around the source than the rest of the mesh, so it matters less how dense the rest of the mesh is.

In the past, we've run mesh-based simulations of this size with mmc without any issue. I think the crash may be caused by the fact that you assign each element a unique optical property. If you run the OpenCL version of MMC on a GPU, I am pretty sure it won't run, because the constant memory is not big enough to hold the property data.
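
As a rough check of the constant-memory point (assuming the usual four-value mua/mus/g/n property row per tissue type, stored as single precision):

```matlab
% Rough size of the per-type property table, assuming four single-precision
% values (mua, mus, g, n) per tissue type; 64 KB is a typical GPU
% constant-memory budget, so a table of this size cannot fit there.
ntypes = 800000;
fprintf('property table ~ %.1f MB (vs. a 64 KB constant memory limit)\n', ...
        ntypes*4*4/2^20);           % four float32 values per tissue type
```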

If you really have to define continuously varying optical properties, my suggestion is to use mcx - you can define mua/mus, or combinations of other optical properties, per voxel.

See the announcement here:

https://groups.google.com/forum/#!msg/mcx-users/LIwnrm6dCys/QYYOL7zoCgAJ
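
For completeness, here is a hedged mcxlab sketch of the per-voxel workflow suggested above, assuming the 4-D single-precision cfg.vol input described in the linked announcement; the exact array layout is an assumption and should be verified against the mcx documentation.

```matlab
% Hedged sketch of per-voxel optical properties in mcxlab, assuming the 4-D
% single-precision cfg.vol format from the linked announcement (leading
% dimension carries the per-voxel values); treat the layout as an assumption.
dim = [60 60 60];
mua = single(0.005 + 0.02*rand(dim));   % per-voxel absorption (1/mm), illustrative
mus = single(0.5   + 1.0 *rand(dim));   % per-voxel scattering (1/mm), illustrative

cfg = struct();
cfg.vol     = permute(cat(4, mua, mus), [4 1 2 3]);   % 2 x Nx x Ny x Nz single
cfg.prop    = [0 0 1 1; 0 0 0.9 1.37];  % shared g/n property table (assumption)
cfg.unitinmm = 1;
cfg.nphoton = 1e6;
cfg.srcpos  = [30 30 1];
cfg.srcdir  = [0 0 1];
cfg.tstart  = 0; cfg.tend = 5e-9; cfg.tstep = 5e-9;

flux = mcxlab(cfg);
```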

Nonetheless, I think mmc is supposed to run this without a problem on a CPU. If you have a test code, feel free to send it to me via Dropbox/Google Drive so I can take a look.

Qianqian
Thanks in advance,
Franz

TrialOptProp_testcode.m.zip
