+5 votes
1 view
in AI and Deep Learning by (790 points)

I am trying to port a Python exercise to Node.js using TensorFlow.js, converting Celsius into Fahrenheit with machine learning. I am getting random answers from the code. I have tried many input shapes but am not getting the desired output. I have checked that the Python and Node.js versions use the same model.

Here is my code:

const tf = require("@tensorflow/tfjs-node")

function convert(cel){
   return (cel*1.8)+32
}

var celsiusTemp = []
var fahrenheitTemp = []

for (let i = 0; i < 20; i++) {
   var value = 100; // Keeping this only value to ensure that Tf knows the answer; I have also tried 20 different values but it doesn't work
   celsiusTemp.push([value]) // Push the celsius input
   fahrenheitTemp.push([convert(value)]) // Push the answer (212) to the fahrenheit array
}


var models = tf.sequential();

models.add(tf.layers.dense({inputShape:[1], units: 1}))

async function trainModel(models, inputs, labels) {
   // Prepare the model for training.
   models.compile({
     optimizer: tf.train.adam(),
     loss: tf.losses.meanSquaredError,
     metrics: ['accuracy'], // Accuracy = 0
   });

   const es = 500;
   return await models.fit(inputs, labels, {
     epochs: es,
     batchSize: 20,
     verbose: false
   });
}



c = tf.tensor(celsiusTemp)
f = tf.tensor(fahrenheitTemp)

var trainingMachine = trainModel(models, c, f)

var predictedValue = models.predict(tf.tensor([[100]]));
predictedValue.print(); // Prints a random number
console.log("Real answer = " + convert(100))





Real answer = 212

1 Answer

0 votes
by (1.7k points)

The main problem with this code is that it uses the wrong optimizer: Adam instead of SGD.

Your code uses the Adam optimizer, which is best suited for large problems, large in terms of data. In your question you only have to learn the Celsius-to-Fahrenheit mapping, which is a tiny linear problem.

SGD also accepts learning-rate decay. When training a model, it is often recommended to lower the learning rate as training progresses; the decay schedule returns the decayed learning rate at each step.

Instead of the Adam optimizer, use the SGD optimizer: it maintains a single learning rate (termed alpha) for all weight updates and, unless you configure decay, does not change that rate during training. Note also that models.fit() is asynchronous, so you must await it before calling predict(); otherwise you are predicting with an untrained model, which is why you see random numbers.
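The decay behaviour mentioned above can be sketched in plain JavaScript. This is the standard exponential-decay formula shown for illustration, not a specific tfjs API, and the function name is just a placeholder:

```javascript
// Exponential learning-rate decay: lr * decayRate^(step / decaySteps)
function decayedLearningRate(initialLr, decayRate, decaySteps, step) {
  return initialLr * Math.pow(decayRate, step / decaySteps);
}

console.log(decayedLearningRate(0.1, 0.5, 100, 0));   // 0.1 (no decay yet)
console.log(decayedLearningRate(0.1, 0.5, 100, 100)); // 0.05 (halved after 100 steps)
console.log(decayedLearningRate(0.1, 0.5, 100, 200)); // 0.025
```

With decayRate 0.5 and decaySteps 100, the learning rate halves every 100 training steps.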

Refer to the following code:

const tf = require("@tensorflow/tfjs-node")

const es=500;

function convertTemp(cel){
 return (cel*1.8)+32 // Convert celsius to fahrenheit
}

let celsiusTemp = []
let fahrenheitTemp = []

for (let i = 0; i < 100; i++) {
 celsiusTemp.push(i) // 100 celsius values; sliced to shape [15,1] below
 fahrenheitTemp.push(convertTemp(i)) // Push the matching fahrenheit value
}


const trainingMachine = async (q,w) => {
 const models = tf.sequential();
 models.add(tf.layers.dense({inputShape:[1], units: 1}))
 models.compile({
   optimizer: tf.train.sgd(0.001), // SGD as recommended above; 0.001 is one reasonable learning rate
   loss: tf.losses.meanSquaredError
 })
 await models.fit(q,w,{epochs:es})
 return models;
}

const prediction = (models,a) => {
 const predictedValue = models.predict(tf.tensor2d([a],[1,1]));
 return predictedValue;
}


const q = tf.tensor2d(celsiusTemp.slice(0,15),[15,1]);
const w = tf.tensor2d(fahrenheitTemp.slice(0,15),[15,1]);

(async () => {
 let trainedMachine = await trainingMachine(q,w);
 for (let a of [4,6,12]) {
   let predictedValue = prediction(trainedMachine, a).dataSync();
   console.log(`Value: ${a} Predicted: ${predictedValue[0]}`)
 }
})()
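Independently of the model, the conversion formula itself is easy to sanity-check in plain Node.js (no tfjs needed), which confirms the training labels are right:

```javascript
// Celsius to Fahrenheit: F = C * 1.8 + 32
function convertTemp(cel) {
  return (cel * 1.8) + 32;
}

console.log(convertTemp(0));   // 32
console.log(convertTemp(10));  // 50
console.log(convertTemp(100)); // 212
```

If the model is trained properly with awaited fit(), its predictions should land close to these exact values.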