Conversation

@stronk-dev
Contributor

This PR:

  • Aborts processing if the output file exceeds 30x the input file size.
  • Adds a test confirming that one of the problematic input segments no longer triggers infinite output.

Note that the test probably no longer works. Will fix.

@j0sh
Collaborator

j0sh commented Sep 3, 2025

Would it be cleaner to count output frames instead of output bytes?

@stronk-dev
Contributor Author

stronk-dev commented Sep 7, 2025

Would it be cleaner to count output frames instead of output bytes?

Agreed.

Some potential areas we could use to detect 'error states' (based on logs from the error state of the test segment):

  • The number of decoded frames remains static while the number of encoded frames keeps increasing. Time jumps do happen occasionally, where some degree of duplication is desired, but we can define rules here to trigger an exit.
  • The original input PTS remained constant. A solution here could be to track the previous source PTS and count 'duplicate frames' in the filter, perhaps erroring out after a second's worth of duplicate frames.

@stronk-dev
Contributor Author

Closed in favour of #439.

@stronk-dev stronk-dev closed this Oct 15, 2025
