---
title: Multiply Two Decimals with JavaScript
---
## Multiply Two Decimals with JavaScript
JavaScript uses the `*` symbol for multiplication. Multiplying floats works the same way as multiplying integers: JavaScript has only one *number* type, which covers both integers and floating point values, so there is no separate integer type.
For example, to multiply the integers 3 and 5, you could simply write:
```javascript
var product = 3 * 5; // product is 15
```
Now if we were to multiply two floating point numbers, 3.4 and 5.7, the product would be a float as well:
```javascript
var product = 3.4 * 5.7; // product is 19.38
```
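Because JavaScript has only one number type, both of the products above are plain numbers. A minimal check, assuming you run it in a browser console or Node, makes this visible:
```javascript
var intProduct = 3 * 5;       // 15
var floatProduct = 3.4 * 5.7; // 19.38

console.log(typeof intProduct);   // "number"
console.log(typeof floatProduct); // "number"
```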
### Hint 1
Think about what decimal number, when multiplied by 2.0, would equal 5.0.
> *Try to solve the problem now.*
## Spoiler Alert!
__Solution Ahead!__
### Code Solution
```javascript
var product = 2.0 * 2.5; // product is 5.0
```
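If you want to verify the result yourself, a quick log statement (run in your browser console or the freeCodeCamp editor) shows the value. Note that JavaScript prints it as `5`, since trailing zeros after the decimal point are not kept:
```javascript
console.log(product); // 5
```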
#### More Information
* [DigitalOcean - How to do Math in JavaScript with Operators](https://www.digitalocean.com/community/tutorials/how-to-do-math-in-javascript-with-operators)