I assume this is the Pythagorean Theorem, where the square of the hypotenuse c equals the sum of the squares of the two legs, a and b, of a right triangle.

c² = a² + b²

To solve for b:
b² = c² - a²

√(b²) = √(c² − a²)

Therefore to solve for b:

b =  \sqrt{c ^{2}-a ^{2}  }

If there is no sign between a² and b² (that is, the equation reads c² = a²b²), then b² = c²/a², so b = c/a.
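
To double-check the formula b = √(c² − a²), here is a minimal Python sketch (the function name and the 3-4-5 triangle example are my own, not from the original answer):

```python
import math

def solve_for_b(c, a):
    """Solve b = sqrt(c^2 - a^2), where c is the hypotenuse and a is the other leg."""
    return math.sqrt(c**2 - a**2)

# Example: a 3-4-5 right triangle, solving for the missing leg
print(solve_for_b(5, 3))  # 4.0
```

Note that this only makes sense when c ≥ a, since c is the longest side of a right triangle.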