AI Back-End as a Service for Learning Switching of Mobile Apps between the Fog and the Cloud

10/02/2021
by Dionysis Athanasopoulos, et al.

Since cloud servers are usually located remotely from the devices of mobile apps, the end-users of the apps can face delays. The Fog has been introduced to augment the apps with machines located at the network edge, close to the end-users. However, edge machines are usually resource-constrained. Thus, the execution of online data-analytics on edge machines may not be feasible if the time complexity of the data-analytics algorithm is high. To overcome this, multiple instances of the back-end should be deployed on edge and remote machines. The research question then becomes how the switching of the app among the back-end instances can be dynamically decided based on the response time of the service instances. To answer this, we contribute an AI approach that trains machine-learning models of the response time of service instances. Our approach extends a back-end as a service into an AI self-back-end as a service that self-decides at runtime the edge/remote instance that achieves the lowest response time. We evaluate the accuracy and the efficiency of our approach using real-world machine-learning datasets on an existing auction app.
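To illustrate the kind of runtime switching decision the abstract describes, the following is a minimal sketch (not the authors' implementation): one regression model per back-end instance predicts its response time from request features, and the app routes the request to the instance with the lowest predicted response time. The instance names, feature layout, training data, and the function predict_fastest_instance are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: [payload_kb, concurrent_requests] -> response time (ms).
X_edge = np.array([[10, 1], [50, 2], [100, 4], [200, 8]])
y_edge = np.array([20, 60, 150, 400])      # edge is fast but saturates under load

X_cloud = np.array([[10, 1], [50, 2], [100, 4], [200, 8]])
y_cloud = np.array([120, 130, 150, 180])   # cloud adds network latency but scales better

# One response-time model per back-end instance.
models = {
    "edge": LinearRegression().fit(X_edge, y_edge),
    "cloud": LinearRegression().fit(X_cloud, y_cloud),
}

def predict_fastest_instance(models, request_features):
    """Return the instance whose model predicts the lowest response time."""
    features = np.asarray(request_features, dtype=float).reshape(1, -1)
    predictions = {name: float(m.predict(features)[0]) for name, m in models.items()}
    return min(predictions, key=predictions.get), predictions

# Example: a light request favours the edge, a heavy one the cloud.
for request in ([20, 1], [250, 10]):
    chosen, preds = predict_fastest_instance(models, request)
    print(f"request={request} -> route to {chosen} (predicted ms: {preds})")
```

The design choice sketched here is that the switching logic itself stays cheap (a prediction per instance plus a comparison), so it can run on a resource-constrained edge machine even when the back-end workload cannot.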
