What is edge computing technology and how does it work? Explained


[Image: What is edge computing technology (pic credit: i-SCOOP)]


Edge computing may appear to be another one of those overused tech phrases with no practical meaning. And although that is partly true, it also has significant implications for machine learning on mobile devices, data privacy, and, perhaps most surprisingly, battery life. I'm about to tell you all about it right now.


All right, to answer the question in the title: edge computing is computing that takes place both locally and remotely. Picture a mobile device that has local access to the data needed to do a calculation, but is still linked to a cloud server to send and receive data as needed.





Also Read: These 9 Industries Will Be Transformed by Quantum Computing by 2030






What exactly does that imply? 


If you want to go deeper, check out this post from The Verge, which was the source of inspiration for our article. But in short: computing used to be something you did on a desktop computer that looked like this.


[Image: a desktop computer]



To put it another way, all computations and programs were once run locally, using the data, information, and processing power available to the computer at the time. Cloud computing changed that: both the data and the computational resources are stored in the cloud, and our mobile devices send and receive data from them in real time.


After all, our mobile gadgets, particularly our phones but also our laptops, can only store so much data and have only so much computing capacity. As a result, cloud computing provided us with real-time access to far larger systems that could train machine learning models or store data too large to fit on our phones.


However, real-time access brought with it a slew of problems, primarily latency, bandwidth, and, more importantly, privacy concerns.


If you have a smartphone or smart home device, you're undoubtedly already familiar with the concept of latency as it relates to smart assistants: you ask for something, and it takes a while to get an actual response.


For example, say you ask Alexa what the weather is like today (something I don't have to worry about, because I got rid of all of my Amazon products). The device must record what you just said, process it locally, and send it to a cloud server, which almost certainly requires compressing the data. The cloud server then has to unpack the data and most likely process it all over again.


It may also talk to other cloud systems that perform different operations on the data, then take their output and send it back to your local device in packets before you finally get your answer. And while all of these steps happen reasonably quickly, given the circumstances, they still take time, which can be problematic for certain kinds of applications.
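
To make the cost of those steps concrete, here is a toy latency-budget comparison of the two pipelines just described, one that round-trips to the cloud and one that stays on the device. Every timing below is a made-up illustrative number, not a measurement of any real assistant.

```python
# Rough back-of-the-envelope model of the round trip described above.
# All step timings are illustrative assumptions, not measurements.

CLOUD_PIPELINE_MS = {
    "record_audio": 50,        # capture what you just said
    "compress_locally": 20,    # pack the audio for transmission
    "upload_to_cloud": 120,    # send it over your connection
    "cloud_processing": 200,   # unpack, process, query other cloud systems
    "download_response": 80,   # send the answer back in packets
}

EDGE_PIPELINE_MS = {
    "record_audio": 50,
    "on_device_processing": 150,  # run the model locally instead
}

def total_latency_ms(pipeline: dict) -> int:
    """Sum the per-step latencies of a pipeline, in milliseconds."""
    return sum(pipeline.values())

print(f"cloud round trip: {total_latency_ms(CLOUD_PIPELINE_MS)} ms")
print(f"edge, on-device:  {total_latency_ms(EDGE_PIPELINE_MS)} ms")
```

However you tune the individual numbers, the cloud pipeline always pays for the upload, the download, and the compression steps that the on-device pipeline skips entirely.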


One of the factors in how long all of this takes is bandwidth. Put simply, bandwidth is the quantity of data you can transfer in a given length of time.


[Image: an internet speed test]



If you've ever run an internet speed test, you've measured the upload and download bandwidth of your internet or cellular connection. As you probably noticed, I have relatively fast Wi-Fi at home, but most smart devices and mobile devices don't necessarily have that fast of a connection.


So if I want to do something on a mobile device that requires cloud computing, like using a smart assistant, the time it takes to actually perform that task depends on the amount of bandwidth I have. And especially for people who live in lower-resource areas without access to that much bandwidth, it can become a big problem.
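
As a rough illustration, transfer time is just payload size divided by bandwidth. The payload size and the connection speeds in this sketch are hypothetical numbers, chosen only to show how quickly a slow link dominates.

```python
# Transfer time is roughly payload size divided by bandwidth.
# The payload size and connection speeds below are hypothetical examples.

def transfer_seconds(payload_megabytes: float, bandwidth_mbps: float) -> float:
    """Seconds to move a payload over a link, ignoring overhead and latency."""
    payload_megabits = payload_megabytes * 8  # 1 byte = 8 bits
    return payload_megabits / bandwidth_mbps

audio_clip_mb = 1.5  # a few seconds of compressed voice audio (illustrative)
for label, mbps in [("fast home Wi-Fi", 300), ("typical 4G", 20), ("weak rural link", 2)]:
    print(f"{label:15s}: {transfer_seconds(audio_clip_mb, mbps):.2f} s")
```

The same clip that moves in a few hundredths of a second on fast Wi-Fi takes several seconds on a weak link, and that cost is paid on every round trip to the cloud.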


A great, very timely example of this is online education during the COVID-19 pandemic: lots of children in low-resource areas don't necessarily have a strong enough internet connection to run things like Zoom all day.


Another good example of the issues associated with cloud computing is autonomous cars, especially as it relates to latency. Imagine I'm driving a car on the road, and my car has to send information back to the cloud to be processed by a machine learning model, then wait for the result to come back before it can make a turn or change lanes.


We simply don't have the mechanisms in place to process that much data quickly enough to drive a car. However, autonomous cars must still be connected to some form of cloud system so that driving data can be sent back to update the bigger autonomous driving model.
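
To see why latency alone is disqualifying here, consider how far a car travels while it waits on a single round trip. The speed and latency figures below are illustrative assumptions.

```python
# How far does a car travel while waiting on one cloud round trip?
# The speed and latency values are illustrative assumptions.

def meters_traveled(speed_kmh: float, round_trip_ms: float) -> float:
    """Distance covered during one cloud round trip."""
    speed_ms = speed_kmh * 1000 / 3600  # km/h -> m/s
    return speed_ms * round_trip_ms / 1000

for latency_ms in (50, 200, 500):
    print(f"{latency_ms:3d} ms round trip at 100 km/h -> "
          f"{meters_traveled(100, latency_ms):.1f} m traveled blind")
```

Even a modest 200 ms round trip means the car covers several meters before any cloud-side decision can reach it, which is why the driving decisions themselves have to be made on board.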


Additionally, each car receives updates to its current model based on data provided by others. Beyond latency and bandwidth, cloud computing also has privacy consequences.


If I'm driving an autonomous car, I don't necessarily want all of my driving history sent up to a cloud server, where it might get hacked and released to the public. Not that I'm doing anything anyone should be worried about. Or take a more real-life example:


If you're using a machine learning-based healthcare system that requires you to upload sensitive data, that data is likely protected by HIPAA, at least in the United States. So if there were a way for it to be processed locally on your phone instead of being sent to the cloud, that would be much better for your privacy.

In short, it would be really nice to have a way to run a lot of these computations locally on our mobile devices or in our cars while still being connected to the cloud to send and receive updates.


This is especially true as we continue to use more data-driven machine learning applications: the data is usually saved on our local devices, but the models that use it start in the cloud. Edge computing, in principle, overcomes this problem.


We can execute computations on the data locally and then send updates to a cloud server. By most definitions of edge computing, we're already doing that with most mobile devices. In actuality, however, the computations we want to run can overwhelm our phones, laptops, and cars, particularly when it comes to massive machine learning models.

Manufacturers have only just started to create the types of hardware that edge computing, or edge AI, will rely on. One such example is Apple's new M1 chip and the subsequent chips it has designed for release this summer.


Another example is Apple's Siri, which uses federated learning to train the Siri model on your phone to recognize your voice, without actually sending those voice recordings back to Apple's cloud servers.


Instead, your phone transmits its model updates to the master cloud model, where they are merged in. Crucially, all of this lowers latency, sidesteps most bandwidth concerns, and keeps your data on your devices rather than in the cloud, which arguably allows for more valuable and personalized machine learning applications.
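
Here is a minimal sketch of that federated-learning loop. The toy NumPy "model" and the names local_update and federated_average are illustrative assumptions for this article, not Apple's actual implementation.

```python
# Minimal federated-learning sketch: each device trains on its own data
# and ships only model weights, never the raw data, back to the cloud.
# The toy model and function names are illustrative assumptions.
import numpy as np

def local_update(global_weights: np.ndarray, local_data: np.ndarray,
                 lr: float = 0.1) -> np.ndarray:
    """Pretend on-device training step: nudge weights toward local data."""
    gradient = global_weights - local_data.mean(axis=0)  # toy loss gradient
    return global_weights - lr * gradient

def federated_average(updates: list) -> np.ndarray:
    """Cloud side: merge device updates by simple averaging (FedAvg)."""
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
global_weights = np.zeros(4)                                # master cloud model
devices = [rng.normal(i, 0.1, (20, 4)) for i in range(3)]   # private local data

for _ in range(5):
    updates = [local_update(global_weights, d) for d in devices]  # on devices
    global_weights = federated_average(updates)                   # in the cloud

print("merged model weights:", global_weights)
```

The key property is visible in the loop: only the weight arrays cross the network, while each device's data never leaves the device.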


While hearing about new and forthcoming technologies is always entertaining, actually playing with them is frequently even more so.



Also Read: What are the greatest scientific discoveries that changed the world
