I've got VVVV working as a C# game, and it's not too bad. There's a bit of a flicker that I'm going to look into now.
I should mention that the flicker was caused by the way I was initially triggering the 60-times-a-second call to my game loop. This brings us on to the next subject in our "Adventures in C#": the timer.
From my experience using libGDX for a VIC 20 emulator project targeting Android, I had started out putting the game loop logic from that emulator project into the render method, which gets called roughly 60 times a second. In libGDX there is no control over how often it gets called, but by using the delta time I could easily work out how many cycles of the VIC 20 emulator to execute for the time elapsed since the last call (as there is obviously no guarantee that it gets called exactly 60 times a second). After getting it working quite nicely on my own phone, I tried it on a few others and was surprised to see that, for devices with comparable specs, I was seeing a reported FPS of 8000 in some cases and 2 in others. That surprised me because what I'd read about libGDX suggested the render method should be called 60 times a second. It turns out that the rate is affected by how much you do within the render method. I was doing too much, and different devices were handling this in different ways. So I shifted all of the logic not related to rendering into a separate thread and kept the render method, and therefore the UI thread, doing solely rendering. I then had to solve the problem of the other thread not being able to do rendering directly (as that is the domain of the UI thread), but I did get it working, and that solved the FPS issues I was having on the other devices. They all started reporting 60 FPS and everything worked nicely.
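The libGDX code was Java, but the delta-time idea translates to C# roughly like this (a simplified sketch; the class, the method names and the clock value are made up for illustration, not the real emulator code):

```csharp
// Hypothetical sketch of delta-time cycle accounting.
public class EmulatorLoop
{
    private const double CyclesPerSecond = 1108404;  // roughly the PAL VIC 20 CPU clock (illustrative)
    private double cycleDebt;

    // Called whenever the framework decides to render, however often that is.
    public void Render(double deltaSeconds)
    {
        // Accumulate fractional cycles so nothing is lost between calls.
        cycleDebt += deltaSeconds * CyclesPerSecond;
        int cyclesToRun = (int)cycleDebt;
        cycleDebt -= cyclesToRun;

        ExecuteCycles(cyclesToRun);  // run the emulated CPU for that many cycles
        DrawFrame();                 // then draw the current machine state
    }

    private void ExecuteCycles(int cycles) { /* emulation goes here */ }
    private void DrawFrame() { /* rendering goes here */ }
}
```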
Coming from that recent experience working on the VIC 20 emulator (which was Java based), my first instinct when coming into C# was to trigger the game loop from a separate thread, i.e. to avoid running it in the UI thread. That is why I was getting the flicker: the UI thread was sometimes redrawing the Bitmap at a point where the background had been drawn but the man had not yet been drawn on top of it (a similar scenario could exist in AGI, where a "save area" for an animated object might have been applied to the background before the new cel has been drawn).
Rather than coming up with a way of managing this, I decided to change from the Thread to a Timer running on the UI thread. This seems to work well, at least for the type of simple graphics that I'm working with. There is no flicker.
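For reference, the Timer-driven loop I've ended up with looks roughly like this (a simplified sketch assuming WinForms; my real update and draw code is obviously more involved):

```csharp
using System;
using System.Windows.Forms;

public class GameForm : Form
{
    private readonly Timer gameTimer = new Timer();

    public GameForm()
    {
        // Let WinForms double-buffer the form to help avoid flicker.
        DoubleBuffered = true;

        // System.Windows.Forms.Timer fires on the UI thread, so updating
        // and painting never race with each other.
        gameTimer.Interval = 16;            // roughly 60 ticks per second
        gameTimer.Tick += (s, e) =>
        {
            UpdateGame();                   // hypothetical game logic update
            Invalidate();                   // request a repaint on the same thread
        };
        gameTimer.Start();
    }

    protected override void OnPaint(PaintEventArgs e)
    {
        base.OnPaint(e);
        // Draw the current frame here, e.g. e.Graphics.DrawImage(bitmap, 0, 0).
    }

    private void UpdateGame()
    {
        // Advance animation, input handling, etc.
    }
}
```

The key point for me is that the Tick handler runs on the UI thread, so the update and the repaint can never interleave the way they could with my separate thread.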
But this brings me on to the following article, which I spent some time reading and studying while considering a timer loop:
https://blogs.msdn.microsoft.com/shawnhar/2010/12/06/when-winforms-met-game-loop/

My first instinct is to trust what Shawn Hargreaves says, as he was the original author of Allegro, the graphics library I was using back in the 90s when I wrote MEKA and one of the two C versions of the VIC 20 emulator. He also works at Microsoft, so a blog post like that has possibly been reviewed by other Microsoft colleagues. But while assessing some of the approaches he mentions, I came across a reddit discussion in which Shawn's post is described as an abomination:
https://www.reddit.com/r/programming/comments/4o11h6/when_winforms_met_game_loop/

I was curious to know what the experienced C# programmers on this forum think of the two sides of that discussion. I'm assuming that the "abomination" comment relates to the approach of using the Idle event and therefore taking up 100% of one core. It doesn't sound ideal, does it? So why would Shawn Hargreaves suggest it? It seems to be intended only for scenarios where you really need that much CPU. I certainly don't, so Shawn's first suggestion, using a Timer, is what I'm currently using, and it is working well for me.
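For completeness, my understanding of the Idle event approach described in Shawn's article is roughly the following (reconstructed from memory, so treat the interop details as a sketch rather than gospel):

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

static class IdleLoop
{
    [StructLayout(LayoutKind.Sequential)]
    private struct NativeMessage
    {
        public IntPtr Handle;
        public uint Message;
        public IntPtr WParam;
        public IntPtr LParam;
        public uint Time;
        public System.Drawing.Point Location;
    }

    [DllImport("user32.dll")]
    private static extern bool PeekMessage(out NativeMessage msg, IntPtr hWnd,
                                           uint filterMin, uint filterMax, uint flags);

    private static bool IsMessageWaiting()
    {
        NativeMessage msg;
        return PeekMessage(out msg, IntPtr.Zero, 0, 0, 0);
    }

    public static void Attach(Action updateAndDraw)
    {
        // Whenever the message queue goes empty, spin the game loop until
        // another message arrives. This is what keeps one core at 100%,
        // which seems to be the objection raised in the reddit thread.
        Application.Idle += (s, e) =>
        {
            while (!IsMessageWaiting())
            {
                updateAndDraw();
            }
        };
    }
}
```

It makes sense for a game that needs every drop of CPU it can get, but for something as light as what I'm doing, the Timer seems the politer choice.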
Any other suggestions? Anything to be aware of when using a Timer?