The run time of BFS is O(b^d)

b is the branching factor and d is the depth (number of levels) of the graph from the starting node.

I googled for a while, but I still haven't seen anyone explain how this "b" is actually determined.

So I know the branching factor means the number of children each node has.

E.g., the branching factor for a binary tree is 2.

So for BFS on a graph, is b the average of the branching factors of all the nodes in our graph,

or is b = MAX(branching factor over all nodes)?
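To make my question concrete, here's a tiny sketch of the two candidates I mean (the adjacency-list graph is just made up for illustration):

```python
from statistics import mean

# A hypothetical directed graph as an adjacency list (illustration only)
graph = {
    'A': ['B', 'C', 'D'],
    'B': ['E'],
    'C': ['E', 'F'],
    'D': [],
    'E': [],
    'F': [],
}

# Out-degree of each node = its branching factor
out_degrees = [len(children) for children in graph.values()]

b_avg = mean(out_degrees)  # candidate 1: average branching factor
b_max = max(out_degrees)   # candidate 2: maximum branching factor

print(b_avg)  # 1.0
print(b_max)  # 3
```

The two candidates clearly give very different values of b even on this tiny graph.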

Also, no matter which way we pick b, the run time still seems ambiguous. For example, suppose our graph has 30000 nodes, only 5 nodes have 10000 branchings, the remaining 29995 nodes have just 10 branchings each, and the depth is 100.
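Working out my own example numerically (these numbers are just the hypothetical graph above), either choice of b seems to blow up far past the actual node count:

```python
# Hypothetical graph from my example: 5 high-degree nodes, the rest low-degree
nodes_high, deg_high = 5, 10000
nodes_low, deg_low = 29995, 10
total_nodes = nodes_high + nodes_low  # 30000 nodes in the graph
d = 100                               # depth

# Candidate 1: average branching factor
b_avg = (nodes_high * deg_high + nodes_low * deg_low) / total_nodes
print(b_avg)  # 11.665

# Candidate 2: maximum branching factor
b_max = deg_high  # 10000

# Either way, b^d dwarfs the 30000 nodes that actually exist
print(b_avg ** d > total_nodes)  # True
```

So b^100 would be astronomically larger than the 30000 nodes BFS could ever visit, which is exactly what confuses me.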

O(b^d) doesn't seem to make sense in this case.

Can someone explain a little bit? Thank you!