Throughput

I attempted a broadcast today with custom RTMP, and it failed. My internet was recently upgraded to 40 Mbps upload, so I wanted to try a higher quality broadcast: I set the throughput to 3 Mbps, where I previously had 1.25 Mbps. The broadcast failed, though, and I needed to revert to my usual settings.

After I'd had a bite to eat, I went back to Wirecast to re-test and find out what went wrong. The 3 Mbps throughput setting (Apple encoder) was sending at around 7 Mbps. For testing, I stopped the stream and changed the throughput to 1.25 Mbps. That worked normally. I re-tested at 2 Mbps. That worked. I re-tested at 3 Mbps, the setting I had used earlier. That also worked. The question is: why was it initially trying to transmit at more than double the configured setting?
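
For reference, this is roughly how I cross-check the outgoing rate independently of Wirecast's own statistics. It is only a minimal sketch, assuming Python 3 with the psutil package (nothing to do with Wirecast itself), and it counts all outbound traffic on the machine, not just the RTMP stream:

# Rough outbound bitrate monitor (Python 3 + psutil, not part of Wirecast).
# Samples the OS network counters, so the figure includes all traffic
# leaving the machine, not only the RTMP stream.
import time
import psutil

INTERVAL = 2  # seconds between samples

prev = psutil.net_io_counters().bytes_sent
while True:
    time.sleep(INTERVAL)
    cur = psutil.net_io_counters().bytes_sent
    mbps = (cur - prev) * 8 / INTERVAL / 1_000_000
    print(f"Outbound: {mbps:.2f} Mbps")
    prev = cur

With the encoder set to 3 Mbps I would expect this to hover a little above 3 Mbps (allowing for container and protocol overhead), not at 7 Mbps.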

What is even more interesting: I am aware of issues with the Apple hardware encoder and video transitions, but this was just regular video content with no transitions. There should have been no reason for the rate to be that high.

Why would Wirecast, configured at 3 Mbps, send at this high rate for a few minutes? And why, when repeating the test, would it drop to a lower level? It is this type of instability that frustrates me the most...
