Relationship between an integer N and the number of bits n required to represent the integer

I’m trying to understand the time complexity of the following code in terms of n.

Pseudocode for trial division
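
For reference, this is roughly what I understand the linked pseudocode to be doing. It's my own sketch in Python, not the exact code from the link:

```python
def is_prime(N):
    """Trial division: test every candidate divisor d with d*d <= N."""
    if N < 2:
        return False
    d = 2
    while d * d <= N:      # the loop body runs about sqrt(N) times
        if N % d == 0:
            return False   # found a nontrivial divisor, N is composite
        d += 1
    return True
```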

I understand that the time complexity of the algorithm is O(sqrt(N)). However, can someone explain how the person in the link below arrived at O(e^(n/2))? In other words, what is the mathematical relationship between N and n? Thanks.
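
To make the question concrete, this is how I'm interpreting n (a small Python sketch with my own example value, not taken from the link):

```python
N = 1_000_003            # example input
n = N.bit_length()       # n = number of bits needed to represent N (20 here)
print(n, int(N ** 0.5))  # bit length vs. the ~sqrt(N) divisions the loop performs
```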

Time complexity explanation
