Until recently I faced issues processing larger maps because my PC didn't have enough RAM for the processor I'm using; the fix was to add more RAM and/or upgrade the processor. Now that ODM is being built for the Apple M1 chip, how large a map can one expect to process on it, given that the hardware configuration is pretty much locked to whatever Apple offers? Would it be feasible to buy an M1 Mac to process maps of around 3,000 images, or is it only suitable for learning and experimentation?
Follow-up question: since the M1 chip has a built-in GPU, will the GPU be used by default for feature extraction?
CPU speed doesn't have much to do with how large a dataset you can process; it mostly determines whether the run takes longer or shorter.
The size of the dataset is limited almost entirely by RAM.
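Since RAM is the limiting factor, you can sanity-check a dataset against a machine before buying. The sketch below is a rough back-of-the-envelope estimator; the ~2 GB per 100 images figure and the 4 GB baseline are assumptions loosely based on community hardware guidance, not an official ODM formula, and real usage varies a lot with image resolution and pipeline options.

```python
# Rough RAM estimate for an ODM run. The coefficients here are
# assumptions (community rule of thumb), NOT official ODM numbers:
#   - baseline_gb: overhead for the OS and pipeline itself
#   - gb_per_100_images: marginal cost per 100 input images
def estimate_ram_gb(num_images, gb_per_100_images=2.0, baseline_gb=4.0):
    """Return an estimated peak RAM requirement in GB."""
    return baseline_gb + (num_images / 100.0) * gb_per_100_images

# A 3,000-image dataset under these assumptions:
print(estimate_ram_gb(3000))  # → 64.0, far beyond a base 8 GB M1 Mac
```

Under these (hedged) assumptions, a 3,000-image map would want on the order of 64 GB of RAM, which is why a base-configuration M1 machine is unlikely to handle it without splitting the dataset.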
As for the M1 GPU being used for acceleration, I'd venture not for a while. I'm not sure Apple supports anything other than Metal on ARM64/M1, so no luck with OpenCL.
OK, that makes sense, but again: how big a dataset can I expect to process on an M1 Mac? Is it worth buying an M1 Mac just to process maps? There's a lot of hype around the new M1 chip and I just want to know how it would perform. I guess I'll have to wait and see.
It has incredible performance per watt. No question.
It is a very competent processor, performing somewhere around modern Intel i7s.
If you don't need a laptop, I think you'd be better served by a desktop build for now.
Has anyone run ODM on the M1 Mac yet? What sort of performance is it yielding? What's the maximum number of photos one can process on the base model with the least RAM?
I got my hands on a Raspberry Pi 4 Model B with 4 GB of RAM and processed a dataset on it. The benchmark results have been committed to the ODM benchmark repo; do check them out and let me know.
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.