Simple linear regression is useful for finding the relationship between two continuous variables: a predictor (independent) variable and a response (dependent) variable. It looks for a statistical relationship rather than a deterministic one. The relationship between two variables is deterministic if one variable can be expressed exactly in terms of the other. For example, given a temperature in degrees Celsius, it is possible to compute the Fahrenheit value exactly.
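For illustration, the deterministic Celsius-to-Fahrenheit relationship can be written directly as a formula; this is the exact mapping our model will later try to approximate from data:

```python
def celsius_to_fahrenheit(c):
    """Exact (deterministic) Celsius-to-Fahrenheit conversion."""
    return c * 9.0 / 5.0 + 32.0

print(celsius_to_fahrenheit(100))  # 212.0
```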

Today we are going to create an **Android App using TensorFlow Lite** to use the Machine Learning model of Linear Regression in Android.

**Creating a Model**

First, we are going to create a Linear Regression model and train it on predefined data, since we are building a supervised model.

For our example, we are going to create a Celsius-to-Fahrenheit converter.

```python
import tensorflow as tf
import numpy as np

tf.logging.set_verbosity(tf.logging.ERROR)

# Training data: Celsius inputs and their Fahrenheit labels
celsius_q = np.array([-40, -10, 0, 8, 15, 22, 38, 40], dtype=float)
fahrenheit_a = np.array([-40, 14, 32, 46.4, 59, 71.6, 100.4, 104], dtype=float)

for i, c in enumerate(celsius_q):
    print("{} degree celsius = {} degree fahrenheit".format(c, fahrenheit_a[i]))

# Four Dense layers; the first one declares the single-value input shape
io = tf.keras.layers.Dense(units=4, input_shape=[1])
io2 = tf.keras.layers.Dense(units=4)
io3 = tf.keras.layers.Dense(units=3)
io4 = tf.keras.layers.Dense(units=1)

model = tf.keras.Sequential([io, io2, io3, io4])
model.compile(loss='mean_squared_error', optimizer=tf.keras.optimizers.Adam(0.1))
hist = model.fit(celsius_q, fahrenheit_a, epochs=500, verbose=False)
print('Model Training Finished')
```

**Now we will convert this model into a .tflite file**

```python
keras_file = 'cf.h5'
tf.keras.models.save_model(model, keras_file)

converter = tf.lite.TFLiteConverter.from_keras_model_file(keras_file)
tfmodel = converter.convert()
open("degree.tflite", "wb").write(tfmodel)
```

At this point we will have a file called degree.tflite.

**Android Studio**

**Create a new project and paste the degree.tflite file into the assets folder.**

Open build.gradle (Module: app).

Add these lines after the **buildTypes** block:

```groovy
aaptOptions {
    noCompress "tflite"
}
```

Then add this line to the **dependencies** block:

```groovy
implementation 'org.tensorflow:tensorflow-lite:+'
```

Now sync Gradle to download the required TensorFlow Lite dependency.

**MainActivity.java**

Import the TensorFlow Lite Interpreter:

```java
import org.tensorflow.lite.Interpreter;
```

Define the Interpreter as tflite:

```java
Interpreter tflite;
```

Now we have to load the model from the assets folder; for that we call loadModelFile():

```java
try {
    tflite = new Interpreter(loadModelFile());
} catch (Exception ex) {
    ex.printStackTrace();
}
```

To load the model file from the assets folder, we memory-map it into a MappedByteBuffer:

```java
private MappedByteBuffer loadModelFile() throws IOException {
    AssetFileDescriptor fileDescriptor = this.getAssets().openFd("degree.tflite");
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}
```

We have one EditText in the app with the variable name et; we read its value and pass it in to get the prediction:

```java
float prediction = doInference(et.getText().toString());
```

The **doInference()** function takes a 1D input array, and its output is a 2D array; we initialize both, call **tflite.run(input, output)**, and return the output value as the prediction.
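As a rough Python sketch of the same shape contract, the following mirrors the 1D-input/2D-output convention; the `weight` and `bias` values here are hypothetical stand-ins for whatever the trained model learned, not values read from the .tflite file:

```python
def do_inference_sketch(input_string):
    # 1D input: one float wrapped in a list, mirroring float[1] in Java
    input_val = [float(input_string)]
    # 2D output: a 1x1 nested list, mirroring float[1][1] in Java
    output = [[0.0]]
    # Hypothetical linear mapping standing in for tflite.run(input, output)
    weight, bias = 1.8, 32.0
    output[0][0] = input_val[0] * weight + bias
    return output[0][0]

print(do_inference_sketch("100"))  # 212.0
```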

```java
private float doInference(String inputString) {
    float[] inputVal = new float[1];
    inputVal[0] = Float.valueOf(inputString);
    float[][] output = new float[1][1];
    tflite.run(inputVal, output);
    float inferredValue = output[0][0];
    return inferredValue;
}
```

Lastly, we have a **TextView** named hw; we write the prediction to it:

```java
hw.setText(Float.toString(prediction));
```
