Author: dvida

The effects of Starlink satellites on meteor observations

Effects on different systems

The effect of Starlink will vary depending on the observation method. For example, backscatter meteor radars such as CMOR will not be affected at all. All-sky fireball cameras which aim to record meteorite falls (e.g. the NASA fireball network, the Desert Fireball Network, etc.) should also not be affected much, because their limiting magnitudes are quite bright (they aim to record meteors brighter than magnitude -3).

A bright meteor recorded by the NASA fireball network camera in Huntsville

In contrast, meteor orbit and flux cameras have much fainter limiting magnitudes. State-of-the-art Global Meteor Network cameras have limiting magnitudes of about +7 at 25 frames per second, so they might record many more Starlink satellites.

Co-added image of all meteor detections from the night of Dec 30-31, 2019. BE0001 camera in Grapfontaine, Belgium.

How bright will they really be?

This section shows a way to simulate the Starlink constellation in Stellarium. This website provides a way to generate a satellite orbit TLE file which can be loaded into Stellarium:

I generated the TLE file with about 12,000 satellites (the initially proposed configuration, which has since been increased about fourfold) and uploaded it to the GMN site:

You can load it into Stellarium by following these steps:

Here is what the simulation looks like:

All in all, the situation doesn’t look too bad. The satellites are relatively few and not particularly bright, but some might still interfere with meteor observations. Note that the final configuration may contain about four times as many satellites.

Meteor detection algorithms

Meteor detection algorithms work by keeping a running mean image (usually an average of the last e.g. 256 video frames) which is subtracted from every video frame. The algorithm then detects straight lines propagating in time whose pixels are some number of standard deviations (usually about 2.5) above that mean. These algorithms do detect satellites even now, but satellites can easily be filtered out using an angular velocity threshold (meteors have geocentric velocities between 11 and 71 km/s, which translates to an angular velocity range of about 2–52 deg/s). Issues might arise with processing times, as every streak will first have to be detected and only then rejected based on its angular velocity.
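As an illustration, the angular velocity cut described above could be sketched as follows. This is a hypothetical example, not RMS code: the track format, function name, and frame rate are all assumptions, and only the 2–52 deg/s limits come from the text.

```python
import numpy as np

# Angular velocity limits for plausible meteors (from the text above);
# satellites drift across the sky far more slowly than this.
MIN_ANG_VEL = 2.0   # deg/s
MAX_ANG_VEL = 52.0  # deg/s


def is_meteor_candidate(track, fps=25.0):
    """Reject streaks whose apparent angular velocity is outside the
    meteor range.

    track: list of (frame_index, ra_deg, dec_deg) measurements,
           ordered in time.
    """
    (f0, ra0, dec0), (f1, ra1, dec1) = track[0], track[-1]
    duration = (f1 - f0) / fps  # seconds between first and last point
    if duration <= 0:
        return False

    # Angular separation between the endpoints (spherical law of cosines)
    ra0, dec0, ra1, dec1 = map(np.radians, (ra0, dec0, ra1, dec1))
    cos_sep = (np.sin(dec0)*np.sin(dec1)
               + np.cos(dec0)*np.cos(dec1)*np.cos(ra1 - ra0))
    sep_deg = np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))

    ang_vel = sep_deg / duration
    return MIN_ANG_VEL <= ang_vel <= MAX_ANG_VEL
```

A streak covering 0.5 degrees in one second would be flagged as a satellite, while one covering 20 degrees in the same time would pass as a meteor candidate.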

The impact is not entirely clear at this point, but if satellite streaks intersect with meteors, there is simply no good way to do reliable astrometry and photometry measurements, which may force us to reject those observations.

Science loss

In an extremely conservative scenario assuming a 25% reduction in observation time (I’m assuming that 2 hours out of an average 8-hour night will be unusable) – how will that influence observations of meteor showers and the sporadic background?

Firstly, optical observations will no longer be able to measure meteor showers from the helion source (such as the Daytime Arietids), which are seen exclusively just before sunrise and are large contributors to the total yearly shower flux.

Next, we won’t be able to observe small meteoroids on low-eccentricity orbits, as their geocentric velocities are highest in the morning (more kinetic energy means more light production, which means a higher detection efficiency). Only in the last few years have we begun to understand that these meteoroids seem to be larger, more numerous, and of entirely iron composition, which we still cannot explain. We need more data.

The good news is that observations of well-known annual meteor showers will not be affected much, as long as there is large longitudinal coverage, which the Global Meteor Network aims to achieve. But these showers are not very interesting, as they are already well observed. On the other hand, rare meteor shower outbursts, which may last only hours (or less) and may have high activity, might not be observed at all if they fall into a “Starlink gap”. These outbursts are the main source of uncertainty in spacecraft meteoroid hazard models, thus their observation is critical. I find it ironic that spacecraft meteoroid hazard models might be hindered by spacecraft.

In reality, the impact will probably be negligible, especially if SpaceX reduces the brightness by a factor of 25. This translates to a ~3.5 magnitude decrease in brightness, which would bring these satellites below the sensitivity threshold of our cameras.
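The factor-to-magnitude conversion quoted above follows directly from the standard logarithmic magnitude relation:

```python
import math

# A brightness reduction by a factor of 25, converted to magnitudes via
# the standard relation: delta_m = 2.5 * log10(brightness_ratio)
factor = 25.0
delta_mag = 2.5 * math.log10(factor)
print(round(delta_mag, 2))  # 3.49, i.e. the ~3.5 magnitude decrease
```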

Orbital debris environment concerns

My concerns are mostly about the orbital debris environment. My single gravest concern is that if one Starlink satellite is pulverized in a collision with another satellite, the debris might shred the whole constellation within hours (read more about the Kessler Syndrome), faster than anyone can react to bring the satellites down. I might be wrong, but I still haven’t seen an analysis of what would happen to satellites that are packed so densely in one part of LEO.

I’m going to finish this post with a quote from Don Kessler’s 2009 overview which was written before Starlink was proposed:
Aggressive space activities without adequate safeguards could significantly shorten the time between collisions and produce an intolerable hazard to future spacecraft. Some of the most environmentally dangerous activities in space include large constellations such as those initially proposed by the Strategic Defense Initiative in the mid-1980s, large structures such as those considered in the late-1970s for building solar power stations in Earth orbit, and anti-satellite warfare using systems tested by the USSR, the U.S., and China over the past 30 years. Such aggressive activities could set up a situation where a single satellite failure could lead to cascading failures of many satellites in a period much shorter than years.

Starlink satellite constellation – possible interference with meteor observations?

Since the recent launches of Starlink satellites, Global Meteor Network cameras have recorded a significant uptick in the number of false meteor detections caused by satellites.

At the end of every night, just before dawn, about half of all 150+ GMN meteor cameras observe a train of parallel satellites. This is what they look like on GMN co-added images:

In this particular case, the camera on the La Palma island, next to the MAGIC telescope at the Roque de los Muchachos Observatory, was recording the outburst of the Alpha Monocerotid meteor shower which can be seen in the background. Here is the video of the outburst (starts at around 2:00) and the satellite passage (around 2:13 in the video):

Fortunately, in this case these 60 satellites did not interfere with the meteor observations, but one has to be concerned about how our skies will look given the plans to launch a total of 42,000 satellites! This might make any optical meteor observations impossible as soon as 2024.

Here is a video that shows several observations of the Starlink constellation with GMN cameras:

And here is what the Starlink constellation looks like from other GMN meteor cameras (click on image for video):

IT0001, Farra Observatory, Italy

HR000D, Ciovo, Croatia

HR0007, Buzet, Croatia

RU000C, Cherkessk, Russia

RU000F, Ka-Dar observatory, N. Arkhyz, Russia

First automated GMN trajectories

More than 100 meteor stations all over the world send their data to our GMN server every day. Until now, this data was sitting idle on the server disk drives. Over the last couple of months I have focused on writing code for automated multi-station meteor trajectory estimation, and now I’m happy to report the first results!

The GMN serverside scripts are using the open-source meteor trajectory code from the Western Meteor Python Library, an implementation of the novel Monte Carlo trajectory solver which produces trajectories of superior accuracy when compared to older methods of trajectory estimation. The paper about it has been submitted to MNRAS and will be published soon.

In this first preliminary data release, we show high-quality meteor orbits recorded with GMN cameras from December 2018 up until now (late August 2019). We only select meteor trajectories with a minimum of 6 astrometry measurements per station, a minimum convergence angle of 5 degrees, a maximum eccentricity of 1.5, a maximum radiant error of 2 degrees, and a maximum velocity error of 10%. Low-quality trajectories usually have a low number of data points or unfavourable observation geometry, even when the astrometric calibration is good. RMS, the software that GMN stations run, recalibrates the astrometric plate on every image that has a meteor detection, ensuring high-quality solutions.
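A minimal sketch of those quality cuts might look like this. The attribute names on the trajectory object are hypothetical, not the actual WMPL field names; only the thresholds come from the text.

```python
def passes_quality_cuts(traj):
    """Apply the quality cuts listed above to a solved trajectory.

    traj is assumed to expose (hypothetical names):
      points_per_station  - list of astrometry point counts, one per station
      convergence_angle   - degrees
      eccentricity        - of the heliocentric orbit
      radiant_error       - degrees
      velocity_error_rel  - relative geocentric velocity error (0.10 = 10%)
    """
    return (
        min(traj.points_per_station) >= 6      # >= 6 astrometry points per station
        and traj.convergence_angle >= 5.0      # minimum convergence angle
        and traj.eccentricity <= 1.5           # reject grossly hyperbolic solutions
        and traj.radiant_error <= 2.0          # maximum radiant error
        and traj.velocity_error_rel <= 0.10    # maximum velocity error of 10%
    )
```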

Figure 1 shows a Sun-centered ecliptic plot of the 14,006 orbits in this first data release. The meteor orbit density is colour coded and several major showers can be seen (Perseids, Southern Delta Aquariids, Geminids, Capricornids, etc.), as well as the sporadic sources. The dataset also contains trajectories of several minor showers, e.g. 3 Camelopardalid orbits. I still need to write a module for orbital shower association; until then, the association needs to be done manually.

Figure 1. Density plot of the GMN orbits in this data release. The density scale is logarithmic.


Figure 2 shows a plot of individual orbits colour coded by the geocentric velocity in the same coordinate system. As expected, the velocities increase up to the maximum of ~71 km/s close to the Earth’s apex in the middle of the plot.

Figure 2. Sun-centered ecliptic plot of orbits, geocentric velocity is colour coded.

Figure 3 shows a map of 45 stations which were used for trajectory estimation. There are more stations that report meteor observations, but they are either single-station or their calibrations and geospatial coordinates need to be confirmed before they are used for trajectory estimation.

Figure 3. Global map of participating stations. Numbers next to country codes represent the number of stations in each country that were used for trajectory estimation.


Finally, we give a link to raw data so interested readers may take a look for themselves: trajectory_summary
This data set is preliminary, thus one should keep in mind that there might be erroneous entries in it. If the data is used, we ask that you reference GMN and this blog post.


Clear skies,
Denis Vida

How the Global Meteor Network came to be

So I guess many of you have heard about RMS and GMN online, but probably don’t know how it all came to be, and who is to “blame” for all of this. So here it goes…

The brief history of the Croatian Meteor Network

The story starts about 12 or so years ago, when Damir Segon started the Croatian Meteor Network (CMN). He was working as a technician in a CCTV company and tried one of the most sensitive camera models he had for recording meteors. Lo and behold, he recorded hundreds of meteors during the 2006 Geminids, and that kickstarted the network (the Geminid video is here:; you can notice Damir’s love for 1970s British rock bands).

2006 Geminids from Pula, Croatia.

So it seemed that good hardware existed, but the problem was, as always, the software. The CMN ended up using SkyPatrol, free software by Mark Vornhusen, which would take a 1-minute block of frames (mind you, the video resolution back then was 384×288 at 25 FPS) and “compress” it in such a way that only the brightest pixel values during that block of frames were saved, together with the frame index at which the brightest pixel value occurred (we ended up calling this MTP, Maximum Temporal Pixel compression). This was all stored in a special BMP image file, so the compression ratio was about 500:1. This compression worked well because meteors are transient phenomena and usually the brightest thing in one block of frames. SkyPatrol also had a very rudimentary meteor detector, but no possibility of meteor calibration.
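The MTP idea is easy to sketch with NumPy. This is an illustrative reconstruction of the concept, not the SkyPatrol code: keep, for each pixel, the brightest value over a block of frames plus the frame index at which that maximum occurred.

```python
import numpy as np

def mtp_compress(frames):
    """Maximum Temporal Pixel compression of a block of video frames.

    frames: uint8 array of shape (n_frames, height, width).
    Returns (maxpixel, maxframe): the per-pixel maximum brightness and
    the frame index at which that maximum occurred.
    """
    maxpixel = frames.max(axis=0)               # brightest value per pixel
    maxframe = frames.argmax(axis=0)            # when that maximum occurred
    return maxpixel, maxframe.astype(np.uint16)
```

For one minute of video at 25 FPS, 1500 frames reduce to two small images, which is where the roughly 500:1 compression ratio comes from; a meteor survives the compression because it is usually the brightest thing any pixel sees during the block.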

Some of you might be asking yourselves: why didn’t we use UFOCapture? Well, it cost a lot of money (as it does today), and back then a fairly powerful PC was needed to run it, while SkyPatrol ran on ancient machines without a hitch.

At that point Damir involved Pete Gural, a software developer and amateur meteor enthusiast. He got his MeteorScan detector to work on MTP-compressed images and to output detections to a file. Damir developed a new method of astrometric calibration which was able to properly calibrate the cheap wide-field lenses CMN was using (and, as it turned out, it can calibrate everything from narrow-field to all-sky lenses). Finally, UFOorbit was used for trajectory estimation. This was all in 2007 and 2008.

Damir Segon (left) and Pete Gural (right) at the IMC2010 conference in Armagh, Northern Ireland. September 2010.

In 2009 I actively joined the CMN. I remember running Pete’s MTP detector on Skypatrol files and how horribly clunky the data flow was. After a few days of that, I wrote the first CMN Python script which took care of automating the data flow. Raw images in, detections out…


Some time in the late 2000s, the folks at the Meteoroid Environment Office (MEO) at NASA Marshall commissioned and paid for a study of a next-generation meteor network. The study indicated that large gains could be achieved with contemporary technology compared to the old systems. They put in a proposal, but were surprised to learn that a stunningly similar proposal by Peter Jenniskens was awarded the money instead of theirs.

Due to his unique expertise, Pete Gural was employed to develop the CAMS software from the ground up. CAMS incorporates some of the know-how gained during the development of CMN procedures. For example, the FTP compression (Four-frame Temporal Pixel) is based on the MTP method. The CAMS network soon expanded rapidly, and its results significantly contributed to our knowledge of the Solar System and the meteoroid environment.

The CAMS recording software was given away for free, but it remained closed source and only Windows executables were available. The code for calibration and trajectory estimation was not available, and the restriction on its distribution (non-exportable from the US) was used for data flow control – nobody except Peter Jenniskens could compute CAMS orbits.

Some time in 2013, the CMN switched from SkyPatrol to the CAMS recording software, but still used its own procedures for data calibration and UFOorbit for trajectory estimation. This was achieved by “gluing” together all of these different executables with a Python script which took care of recording, data format conversion, archiving and uploading the data to the CMN server. Some of this code was reused for RMS, but most was just spaghetti code. This was my first big Python project and I learned a lot in the process.

RMS – the beginnings

Some time in 2014, I met Dario Zubovic for the first time. He was still in high school then and was very keen to use Raspberry Pis for recording meteors. The first version of the Pi was only barely powerful enough to read raw video frames from EasyCap to memory in real time. In 2015, Raspberry Pi 2 was released. It had 4 cores and 1 GB of RAM, which was very usable. In a matter of weeks (in June 2015), Dario and I had our own version of the FTP compression running in real time and saving FF files to an SD card. The fireball detection was also developed from scratch in about a week, and we had the first version of the star and meteor detection working by the end of the summer of 2015. Dario basically did all the “heavy lifting” in those early days.

There was one particular thing that was very difficult to do – name the project. Dario, Damir and I spent days thinking about it. We couldn’t think of anything good, so we went with Raspberry Pi Meteor Station (RMS) as a ‘temporary’ name, as we needed to call the GitHub repository somehow. The name stuck, and here we are…

During that summer, we also participated in the Hackaday Prize competition; you can see our entry here: We made it to the semifinals, and I have a shirt to prove it! The main prize that year was a trip to space (that’s right), but it wasn’t meant to be… We have since dropped the Asteria name and continued to use RMS operationally.

Dario and I recording a video for Hackaday at the observatory in Pula, Croatia. August 2015.

Because the project looked so promising, and because most of the CMN ran on ancient computers that would have to be replaced ASAP, Damir got very excited and announced to CMN camera operators that there would be a Raspberry Pi solution by the end of the summer. Of course, that was very far from reality.

In the fall of 2015, Dario started college and I was wrapping up my Master’s and applying to the University of Western Ontario for a PhD with Peter Brown. By the summer of 2016, the RMS software could reliably record, compress, and detect fireballs, stars and meteors from analog cameras on a RPi 2.

There was a hiatus in development until the spring of 2017. Pete Gural came for a visit to UWO in November and brought me a Sony ICX673-based camera, and I set up one testing station looking out of the meteor lab at UWO. This testing period helped to weed out many bugs, but the whole project was basically an afterthought for me.

The jump from analog to digital

This was also the time when I was very concerned about the status of the amateur meteor community. Analog cameras were slowly disappearing, EasyCap dongles were bad, good capture cards were very expensive and old TV cards were impossible to find anymore. Sony announced that they would stop manufacturing CCD cameras. Things looked really grim. At this point I was a bit discouraged about the whole project and didn’t do much. What’s the point if there won’t be any cameras to use?

Fortunately, in January of 2017 I met Mike Mazur who came back to Canada to do his PhD after living in Norway for more than 10 years. He was very enthusiastic about the project because he tried to start a similar thing in Norway. In June that year we mounted the first permanent station at Elginfield, Ontario.

Mike Mazur (left) and I with the first permanent RMS station in Elginfield, Ontario. June 2017.

That summer I was mentoring a group of very promising high school students at the Visnjan School of Astronomy in Croatia. We deployed an array of 4 analog cameras, each one running on a Pi, and were able to obtain the first double-station orbits using RMS data. During that school, I wrote the first version of SkyFit (the astrometry and photometry calibration tool) in one afternoon.

Shortly after we got the first RMS orbit during VSA2017. From left to right: Me, Patrik Kukic, Filip Parag, Anton Macan. August 2017, Visnjan, Croatia.

In September 2017 we participated in the International Meteor Conference in Petnica, Serbia. There we attended a lecture by Mike Hankey who presented an idea of using cheap IP cameras with IMX290 CMOS sensors. The videos that Mike showed weren’t that great, but they intrigued us.

A few days after getting back from the IMC, I ordered an IMX225 camera and a 4mm f/1.2 lens. Some time in mid-November 2017, Mike Mazur and I were sitting on my living room couch, the Ethernet cable stretched through the kitchen out to the patio, the camera pointing at the clear night sky. It took a while to connect to the damn thing… “HOLY SH*T!!!” were the first words when we saw the live video, and we saw a bright meteor streaking through the field of view a few seconds later. After adjusting the gain and some other settings, we estimated that the limiting stellar magnitude was around +5.5.

The first RMS IP camera with the IMX225 sensor and 4mm f/1.2 lens. December 2017.

Bright meteor recorded with the camera on the first night of testing from London, Ontario.

In a matter of days I was able to grab the video from the IP camera on the Pi. Luckily, the Pi supports GPU decoding of H264 video, which meant that there wasn’t any CPU overhead and that RMS could run without any issues on 720p video.

The big concern about CMOS sensors was whether good photometric calibration could be done with them – everyone was worried that they were not linear. In December 2017 we showed in a paper published in WGN that CMOS sensors are perfectly fine for meteor photometry.

These cameras also had a different readout than classical interlaced analog cameras. They have a rolling shutter, which basically means that every row of pixels starts integrating at a different time, with a constant phase shift. Patrik Kukic worked hard with Pete Gural and me to develop a correction for the rolling shutter effect on meteors, which resulted in another publication in WGN. The conclusion was that the rolling shutter effect can be mitigated by applying a simple correction to the time stamps of meteor detections.
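The essence of such a timestamp correction can be sketched as follows. This is a simplified illustration, not the exact formulation from the WGN paper: with a rolling shutter, a detection centroid in row y starts integrating later than row 0 by a fixed fraction of the frame period.

```python
def rolling_shutter_time(frame_time, row, img_height, fps=25.0):
    """Return the corrected time of a meteor centroid at a given image row.

    frame_time: nominal timestamp of the start of the frame (seconds)
    row:        pixel row of the centroid (0 = top, read out first)
    img_height: total number of pixel rows in the frame
    fps:        video frame rate
    """
    frame_period = 1.0 / fps
    # Each row is read out later by a constant phase shift across the frame
    return frame_time + (row / img_height) * frame_period
```

For a 720p sensor at 25 FPS, a centroid halfway down the frame would be shifted by 20 ms, a large fraction of the 40 ms frame period and enough to bias meteor velocity measurements if left uncorrected.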

Global Meteor Network is born

At the end of 2017, Aleksandar Merlak got seriously involved in the project and set up a few IP cameras at his house in Hum, Croatia. After pulling out all of his hair while trying to get everything to work, he kindly offered to battle the insanity that is the Croatian bureaucracy and start a company which would sell assembled cameras for a very reasonable price.

Mike Mazur and I assembled one system and put it for sale on ebay in January 2018. Pete Eschman from Albuquerque bought it. Later that year, Pete would buy 14 cameras and deploy the first large RMS meteor network.

The rest of that year was spent polishing the code, making sure it was scalable, and developing ways of easily deploying it. In June 2018 the first pair of stations with overlapping fields of view was installed – a camera in Tavistock, Ontario was paired with the one at Elginfield. At that point the RMS software entered the beta testing phase, in which it remains to this day.

The first RMS station independently set up by a community member was in France – Jean Marie Jacquart bought an IMX291 camera and I sent him a 64 GB SD card with RMS set up on it. Back then I did not know how to clone and shrink RPi cards, and I didn’t have anywhere to upload all 64 GB. Luckily, easily deployable SD card images are now available to everyone!

I presented one RMS system at the International Meteor Conference in Slovakia and generated some interest. After that, I lost track of new stations – they just kept popping up all over the world. As far as I know, people are running RMS cameras in 11 different countries, maybe even more!

The first true test for the system came during the 2018 Geminids. We showed in a short article that RMS systems produce reliable results in near real time, as it took less than 24 hours to estimate the orbits and write an article about the observations from the peak night. We also shared the raw observations and orbits from that night – the first immediate data release by the amateur community.

The peak night of the Geminids (910 meteors) from Hum, Croatia. Credit: Aleksandar Merlak

The peak night of the Geminids from Albuquerque, New Mexico. Credit: Pete Eschman

Global Meteor Network – the future

Throughout the history of meteor science, people kept reinventing the wheel. There exist too many implementations of the same thing out there. E.g. Sirko has his astrometry calibration code, UFO has it, the Czechs have theirs, Rob Weryk wrote some while at UWO, Damir wrote one for CMN, Pete Gural wrote his own for CAMS… All of these codes are closed and used by only a handful of people. How can we be sure that they are all correct, and how can we trust everyone’s results?

It’s been enough. It’s time to write one transparent meteor software library that everyone can use and understand, so we can stop this nonsense. Too many work hours of smart people have gone into repeating what others have already done, hours that could have been spent doing something actually scientifically useful. The RMS project aims to fix that.

So even before the recent rapid expansion, we had the idea that this project needs to be global and solve several crucial issues older meteor networks had:

  1. They were usually national and very fragmented. Many different data formats were used, the same meteors were usually recorded using different software, calibrated with different calibration methods, etc. There was basically no cooperation because of that. There were exceptions, of course, such as the IMO network and CAMS.
  2. All previously used code was closed source. The methods used were not transparent and the development was done by only one person. There are many people in the community that have good ideas, and time and expertise to implement them, but no access to the code base.
  3. Because old meteor systems and software were expensive and data reduction was time consuming, the final data products were expensive as well. That’s why they weren’t publicly shared until scientific value was extracted from them, or people just continued to sit on the data for lack of time to process it.
  4. Data reporting was slow and data releases were usually years behind. The only network with a real-time data release is the NASA Fireball Network; CAMS has real-time data reporting, but the data is not publicly available.
  5. Because the financial bar of entering the realm of amateur video meteor observation was set very high, students or people from lower income countries were not able to participate in such projects.

By making the meteor systems cheap and the data reduction automated, GMN aims to reduce the cost of data to a minimum, which justifies making the data public. For that reason I’m introducing a measure of meteor data cost: dollars per meteor per year ($/met/yr). Let’s say that one RMS system costs $450, and let’s be conservative in our estimates by assuming that it will last 2 years, record 10 meteors an hour over an average night of 8 hours, and that there are 150 clear nights in a year. Disregarding the cost of electricity, the Internet connection, etc., the RMS system has a meteor cost of ~0.02 $/met/yr. Furthermore, assuming that 2 stations are needed for meteor orbit estimation and that double-station meteors are observed with an efficiency of 50%, the conservative estimate of the cost of a database of 100,000 meteor orbits is $8,000. This final estimate is very dependent on the number of meteors each camera records. For example, with a dark-sky environment and a narrower lens, the number of meteors per hour can go up to 20, which means that the final data product can be had for a mere $4,000 (nine RMS systems) – the cost of only one pair of previous-generation stations.
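The back-of-the-envelope arithmetic above can be reproduced directly; all input numbers are taken from the text.

```python
# Per-system cost of a single recorded meteor
system_cost = 450.0          # USD per RMS system
lifetime_years = 2
meteors_per_hour = 10
hours_per_night = 8
clear_nights_per_year = 150

meteors_over_lifetime = (meteors_per_hour * hours_per_night
                         * clear_nights_per_year * lifetime_years)  # 24,000
cost_per_meteor = system_cost / meteors_over_lifetime               # ~0.019 $/met

# Orbits require 2 stations, and double-station meteors are matched
# with 50% efficiency, so each orbit consumes 4 single-station detections
orbits_wanted = 100000
detections_needed = orbits_wanted / 0.5 * 2                         # 400,000
orbit_database_cost = detections_needed * cost_per_meteor
print(round(cost_per_meteor, 3), round(orbit_database_cost))
```

The exact product is $7,500; the quoted ~$8,000 figure follows from using the rounded 0.02 $/met cost instead of the raw 0.01875.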

The current focus of the project is polishing the code and expanding the community. We are slowly introducing community building tools (the website, forum, mailing list) and the RMS wiki will be set up soon. One big thing that is currently missing is documentation – we hope that the community will help to write it.

All stations report their observations to the server at the University of Western Ontario, but currently the data just sits there. By the end of 2019 we plan to have real-time trajectory estimation and orbit publishing. Data will be put into a database that everyone will be able to access. At this point we will have to employ somebody with a good knowledge of front end web development to create a user friendly website.

Finally, the long term plans include writing proposals for grants which would enable the expansion of the network and donation of meteor systems to schools. The heritage of the Croatian Meteor Network is dedicated work with prospective students, and that is something we believe has been neglected by the meteor community. The Global Meteor Network aims to raise global awareness of the need for night sky surveillance, and through meteor astronomy introduce students to the wonders of astronomy and STEM fields.

The meteor group students during the Visnjan School of Astronomy. August 2015.

Meteor camera FOV visualization tool resurrected!

About 10 years ago Geert Barentsen wrote an awesome tool for visualizing the fields of view (FOVs) of meteor cameras, hosted on ESA webpages. Unfortunately, it stopped working some years ago and the meteor community was left without it!

But, because the source code was available, we were able to resurrect it and host it on our own webpages! You can access it here:

FOV3D tool

This tool allows you to input the location of your meteor station, the camera pointing, and the size of the field of view. It will then generate a KML file that you can load into Google Earth!
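For readers curious what such a KML file contains, here is a hypothetical minimal generator of an FOV footprint polygon. The function name and the corner coordinates are made up for illustration; the real tool computes the footprint from the station location, pointing and FOV size.

```python
def fov_kml(name, corners):
    """Build a minimal KML document with one polygon outlining a camera FOV.

    corners: list of (lon_deg, lat_deg, alt_m) tuples; the first and last
             point should coincide to close the ring, per the KML spec.
    """
    ring = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in corners)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Polygon>
      <altitudeMode>absolute</altitudeMode>
      <outerBoundaryIs><LinearRing>
        <coordinates>{ring}</coordinates>
      </LinearRing></outerBoundaryIs>
    </Polygon>
  </Placemark>
</kml>"""
```

Saving the returned string as a `.kml` file and opening it in Google Earth draws the FOV outline at the chosen altitude (e.g. a typical 100 km meteor height).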