To me the method you mentioned doesn't look hackish at all. JavaScript, unlike languages such as PHP or C, does not differentiate between integer and floating-point values; it knows only one numeric type: Number. So you won't find functions like is_int(), or write something like if (gettype(number) == 'int'), as you would in PHP. Manually checking for the presence of a fractional part is therefore a natural solution.
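To illustrate the point about the single numeric type, here is a quick sketch you can run in any browser console or Node (the literal values are just examples):

```javascript
// JavaScript has one numeric type, so typeof cannot tell
// an integer apart from a floating-point value.
console.log(typeof 12);    // "number"
console.log(typeof 12.5);  // "number"

// The literals 12 and 12.0 denote the exact same Number value.
console.log(12 === 12.0);  // true
```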
So if somevalue % 1 == 0, then you know that somevalue is an integer; otherwise it is a floating-point value.
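Wrapped in a helper function (the name isInteger here is just for illustration), the check looks like this:

```javascript
// A Number is an integer exactly when dividing by 1 leaves no remainder.
// The typeof guard also rejects non-numbers, NaN % 1 is NaN (never 0),
// so NaN and Infinity fall through to false as well.
function isInteger(n) {
  return typeof n === "number" && n % 1 === 0;
}

console.log(isInteger(5));    // true
console.log(isInteger(5.1));  // false
console.log(isInteger(12.0)); // true — 12.0 is the same value as 12
console.log(isInteger(NaN));  // false
```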
To test this, type 5 % 1 into a Google search to calculate the result, then type 5.1 % 1 to see what that gives. Try different combinations of numbers and you will see what is going on.
Sorry for the confusion. The question I was asking was whether the program needs to distinguish between numbers being entered as integers or floating point. Does it care if a user enters 12.0 as opposed to 12?
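One detail worth noting on that question: user input arrives as a string, and once a string like "12.0" is parsed, the trailing ".0" is gone, so 12.0 and 12 become indistinguishable as numbers. A sketch (parseFloat and the string check are just one way to show this):

```javascript
// After parsing, "12.0" and "12" yield the exact same Number value.
const a = parseFloat("12.0");
const b = parseFloat("12");
console.log(a === b);  // true

// To know how the user typed it, you must inspect the raw string,
// e.g. by checking for a decimal point before parsing.
console.log("12.0".indexOf(".") !== -1);  // true
console.log("12".indexOf(".") !== -1);    // false
```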