I'm not sure I understand what it means when it says "The value is predefined buckets at 100kbps intervals." Could someone elaborate on that? Also, when it says weighted average, does that mean that some of the bitrate values don't contribute as much as others? e.g. a bitrate value for a BitrateChange event might contribute less than the bitrate for a Play event.
On the Quality Parameters page there are two entries for "Average Bitrate". The first is in the "Quality Metadata" section and this one corresponds with the Average Bitrate Dimension. When it says, "The value is predefined buckets at 100kbps intervals," it's referencing how the value will be displayed as a 100kbps range. So if the average bitrate is 300kbps, it will get reported as (300-399).
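To make the bucketing concrete, here's a minimal sketch of how a bitrate value maps to its 100kbps range. The function name and the label format are my own; the docs only say the value is reported as a 100kbps bucket:

```python
def bitrate_bucket(bitrate_kbps: int) -> str:
    """Map a bitrate to its 100kbps reporting bucket, e.g. 300 -> '300-399'."""
    # Floor to the nearest multiple of 100 to get the bucket's lower bound.
    low = (bitrate_kbps // 100) * 100
    return f"{low}-{low + 99}"

print(bitrate_bucket(300))  # 300-399
print(bitrate_bucket(347))  # 300-399
```

So any session averaging anywhere from 300 to 399 kbps lands in the same reported bucket.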
The second "Average Bitrate" entry is in the Quality Metrics section. The metric gives you the total Average Bitrate value across all user sessions. During a user session, the average bitrate is calculated as (Sum of (duration_in_play_ping * bitrate_in_play_ping)) / total_play_duration.
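That formula is a duration-weighted average, which also answers the original question: a bitrate reported in a short play ping contributes less than one covering a long stretch of playback. A minimal sketch, assuming each play ping arrives as a (duration, bitrate) pair (the names are illustrative, not from the actual payload):

```python
def average_bitrate(pings):
    """Duration-weighted average bitrate over a session's play pings.

    pings: iterable of (duration_seconds, bitrate_kbps) tuples.
    """
    total_duration = sum(duration for duration, _ in pings)
    if total_duration == 0:
        return 0.0
    weighted_sum = sum(duration * bitrate for duration, bitrate in pings)
    return weighted_sum / total_duration

# 10s at 300kbps plus 30s at 500kbps averages to 450kbps,
# not the unweighted mean of 400kbps.
print(average_bitrate([(10, 300), (30, 500)]))  # 450.0
```

Note that a longer ping dominates the result, which is why the docs call it a weighted average.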
As a side note, Average Bitrate doesn't appear to behave the way the documentation describes. From what I can tell, it just keeps a running total of all bitrate values that have been reported rather than weighting them by play duration. Has anyone else seen this?