Posts
Yahoo! I managed to get the distributed processing working :) The approach I chose was to use AWS's container service (ECS). I made a few tweaks to the correlation library so that, with an optional parameter, it can dump out candidate matches to files (in Python pickle format). These are then distributed to containers in groups of 20, cutting runtime and cost dramatically. Previously it took about on...
Now with added containers!
Jul 23, 2022
98 Views
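The grouping step described above can be sketched roughly as follows. This is a minimal illustration, not the actual correlation library's code; the function and file names are hypothetical, and only the "split candidates into groups of 20 pickle files" idea comes from the post.

```python
import pickle
from pathlib import Path

def chunk_candidates(candidates, group_size=20):
    """Split a list of candidate matches into fixed-size groups,
    one group per container task."""
    return [candidates[i:i + group_size]
            for i in range(0, len(candidates), group_size)]

def dump_groups(candidates, out_dir, group_size=20):
    """Pickle each group to its own file so the files can be
    handed out to separate ECS containers (illustrative only)."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for n, group in enumerate(chunk_candidates(candidates, group_size)):
        path = out / f"candidates_{n:03d}.pkl"
        with path.open("wb") as f:
            pickle.dump(group, f)
        paths.append(path)
    return paths
```

Each container would then load its own pickle file and process just that slice of the candidate matches, which is what makes the runtime and cost drop roughly in proportion to the number of containers.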
My biggest bugbear is that as the network has grown, so has the time - and cost - of data processing. Each morning, we process as many as 25,000 individual potential meteors from around 100 stations, spanning the last two nights (this is necessary to catch any late-arriving data from the night before last). That sounds like a lot, but actually it's only around 120-150 per station per night, and if we have...
Working on Distributed Processing
Dec 13, 2021
116 Views
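The per-station figure quoted above follows directly from the headline numbers, assuming detections are spread roughly evenly across stations and nights:

```python
# Rough load estimate from the figures quoted in the post.
detections = 25_000   # potential meteors processed each morning
stations = 100        # approximate station count
nights = 2            # each run spans the last two nights

per_station_per_night = detections / stations / nights
# 25,000 / 100 / 2 = 125, consistent with the 120-150 range quoted
```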