Real-time GCs can only guarantee a certain number of deallocations per second. Even with a very well-designed GC, there's no free lunch. A system that manages its memory explicitly never has to risk overloading a GC in the first place.
I think you have that backwards; they can only guarantee a certain number of allocations per second (once the application hits steady state, the two rates are the same, but there are times when the distinction matters).
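To make the "per allocation" framing concrete: incremental collectors typically schedule their work at allocation points, doing a bounded chunk of collection work on each allocation, so the real-time bound naturally falls on allocations rather than deallocations. Here's a toy sketch in C of that pacing idea. All names are made up, and the mark phase is elided (a real collector's tracing, not shown, would set `marked` on reachable objects each cycle); this is an illustration of the scheduling, not any real collector's implementation.

```c
#include <stdlib.h>

#define WORK_PER_ALLOC 4   /* bounded GC work budget per allocation */

typedef struct Obj {
    struct Obj *next;      /* intrusive list of all heap objects */
    int marked;
    /* ... user payload would follow ... */
} Obj;

static Obj  *heap = NULL;        /* every object, live or dead */
static Obj **sweep_pos = &heap;  /* incremental sweep cursor */

/* One bounded unit of collection work: inspect a single object.
   A real incremental collector would also split marking into units. */
static void gc_step(void) {
    if (*sweep_pos == NULL) {
        sweep_pos = &heap;       /* sweep pass finished; wrap to start */
        return;
    }
    Obj *o = *sweep_pos;
    if (!o->marked) {
        *sweep_pos = o->next;    /* unreachable: unlink and reclaim */
        free(o);
    } else {
        o->marked = 0;           /* survivor: reset for the next cycle */
        sweep_pos = &o->next;
    }
}

/* Allocation is the only point where GC work runs, so the worst-case
   mutator pause is WORK_PER_ALLOC * cost(gc_step) per allocation --
   a guarantee stated per allocation, not per deallocation. */
Obj *gc_alloc(void) {
    for (int i = 0; i < WORK_PER_ALLOC; i++)
        gc_step();
    Obj *o = malloc(sizeof(Obj));
    if (o == NULL) return NULL;
    o->marked = 1;               /* "allocate black": a fresh object
                                    survives the in-progress sweep */
    o->next = heap;
    heap = o;
    return o;
}

int main(void) {
    for (int i = 0; i < 1000; i++)
        gc_alloc();              /* each call does at most WORK_PER_ALLOC steps */
    return 0;
}
```

Note how this also shows why the two rates coincide at steady state: the collector only reclaims while the mutator allocates, so over a long run the sustained deallocation rate can't exceed (and in equilibrium matches) the allocation rate.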