
Using Data Analytics to Improve Cloud Security: A Comparative Analysis

Today, technological advancement and the digitalization of the business world are proceeding at an unanticipated pace. Unfortunately, this digitalization has brought an increased risk of cyber threats. Big data analytics is widely regarded as a powerful way to protect organizations and their data from cyber-attacks. Cybercriminals use increasingly sophisticated methods and tools, often targeting long-established companies. Companies worldwide have applied various strategies to prevent cyber-attacks, but the main obstacles have been the sheer volume of data and the problem of scalability. As a result, more effective defensive strategies are needed; in other words, companies must rethink how they respond to cybersecurity threats. This paper examines how data analytics can be used to improve cloud security.

Using Data Analytics to Improve Cloud Security

In simple terms, big data may be defined as large-scale data analysis and management technologies that exceed the capability of conventional data-processing techniques. Big data differs from traditional technologies in three significant ways: the amount of data involved (volume), the rates of data generation and transmission (velocity), and the mix of structured and unstructured data types (variety).

Today, people create approximately 2.5 quintillion bytes of data daily, and the rate of data creation keeps accelerating: roughly 90% of the world's data was created in the past two years alone. This accelerated production of data and information has driven the need for newer technologies that can effectively analyze such massive data sets (Big Data Working Group, 2013).

Big data can transform security analytics by offering new tools and opportunities to leverage large amounts of both structured and unstructured data. In that regard, it is prudent to define big data analytics: in the simplest terms, it is the process of mining and analyzing very large data sets. Big data analytics can yield business and operational insight at unprecedented scale and specificity. The urgent need to analyze and leverage the trend data collected by enterprises is among its major driving forces. Technological developments in the analysis, processing, and storage of big data sets have also helped; for example, the costs of data storage and CPU power have declined rapidly in recent years, and data centers and cloud computing now offer flexible, cost-effective elastic storage and computation.

There has also been the introduction of modern frameworks such as Hadoop, which let users take full advantage of distributed computing to store large amounts of data and process them in parallel and flexibly. These advancements account for the difference between big data analytics and traditional analytics.
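As a minimal sketch of the map-and-reduce pattern that Hadoop popularized, the following Python fragment counts security events in two mapper/reducer-style stages (the log lines and event names are invented for illustration; a real cluster would run the phases across many machines):

```python
from collections import Counter
from itertools import chain

# Toy security log lines (invented for illustration).
logs = [
    "2016-06-01 login_failed host=web01",
    "2016-06-01 login_failed host=web01",
    "2016-06-01 login_ok host=web02",
    "2016-06-02 malware_detected host=web03",
]

def map_phase(line):
    # Emit (event_type, 1) pairs, as a Hadoop mapper would.
    return [(line.split()[1], 1)]

def reduce_phase(pairs):
    # Sum counts per key, as a Hadoop reducer would.
    counts = Counter()
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

event_counts = reduce_phase(chain.from_iterable(map_phase(l) for l in logs))
print(event_counts)
```

The point of the split is that each phase parallelizes independently, which is what lets such frameworks scale past a single machine.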

Today, big data analysis can be used to address various challenges facing cybersecurity. As IT networks have grown more complex, cybersecurity attacks and threats have grown rapidly in both complexity and inventiveness. For example, between June and November 2016, close to 1 billion malware-related events took place. The estimated total cost of cybercrime runs up to $1 billion, and 99% of all computers in the world are vulnerable to cyber threats and attacks. The image below shows the total costs of cyber-related crimes in seven selected countries:

There have been numerous efforts to combat cybersecurity threats and risks. As malware attacks continue to grow in both complexity and volume, it has become harder for traditional analytical infrastructure and tools to keep up. The first challenge concerns data volume: on a single day, SophosLabs receives about 300,000 new, potentially harmful files that need to be analyzed. The second challenge is scalability: SQL-based infrastructure and tooling do not scale well and are also expensive to maintain.

Data analytics is considered the most promising path to cybersecurity. A company has to protect itself from every kind of threat, whereas a potential attacker needs only a single successful try. With such odds, a company cannot merely attempt to prevent attacks from occurring; it must also detect and respond to threats quickly and effectively. This is known as the PDR paradigm: Prevention, Detection, and Response. This is where data analytics comes in. Organizations and key analyst firms have recognized that many of these issues can be overcome through data analytics, and analyst firms continue to write reports and advise their clients about the effect of big data analytics on cybersecurity across industries. For example, the CDC states that cloud and big data analytics can ward off the cyber threats that target health institutions. Companies are investing heavily in combating data breaches: they are identifying anomalies in device and network behavior, including abnormalities in contractor and employee behavior, and assessing network vulnerabilities and risks.

Big data has been significantly changing the general analytics environment. More precisely, data analytics can be leveraged to enhance situational awareness and information security. For example, it can be used to analyze log files, financial transactions, and network anomalies to identify defects or suspicious activities, and to correlate multiple data sources into a more coherent view. Operationalizing big data has further benefits: under the PDR approach of preventing, detecting, and responding to threats, merely detecting potential risks is never enough. The real value of big data comes from driving action by business teams. One needs an operationalization capability that can sift through the data, identify the right signals, and then initiate the most appropriate response (Datameer, 2018).
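To make the log-analysis idea concrete, here is a small hedged sketch of anomaly detection over login activity. The user names, counts, and the 5x-median threshold are all invented for illustration; real systems use far richer features and models:

```python
import statistics

# Daily failed-login counts per user (invented numbers for illustration).
failed_logins = {"alice": 2, "bob": 3, "carol": 4, "mallory": 48}

# The median is a robust baseline: one extreme user barely shifts it.
median = statistics.median(failed_logins.values())

# Flag users whose count exceeds 5x the median baseline (arbitrary threshold).
anomalies = sorted(u for u, c in failed_logins.items() if c > 5 * median)
print(anomalies)
```

Using the median rather than the mean is deliberate: a single compromised account with thousands of failures would inflate a mean-based threshold and hide itself.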

Big data is significantly improving cybersecurity, and analytics has shown great promise. For example, 90 percent of the respondents in MeriTalk's recent US government study reported a significant decline in the number of security breaches, and 84 percent stated that they had successfully applied big data to prevent cyber-attacks. Keeping up with the volume of data remains a vital concern, however, and new challenges appear as cybersecurity threats keep emerging daily: an overwhelming volume of data, a lack of the right systems, and data that is stale by the time it reaches the cybersecurity manager.

If big data is poorly mined, it can be ineffective for threat analysis. The metadata may be available, but it may prove difficult to obtain maximum benefit from it. The problem then becomes finding people who are well versed in mining data for trends. Cybersecurity requires the actionable intelligence and risk management that big data analysis provides, so it is advisable to have tools that can effectively analyze large data sets. The secret, however, lies in automating tasks. Automation ensures that the required data is readily available and that the resulting analysis is dispatched to the right individuals early enough. This, in turn, enables data analysts to classify cyber threats and risks without the extensive delays that might make the data irrelevant to the attacks in progress (SentinelOne, 2016).

The business world has witnessed massive digitalization, but this digitalization has come with an increased risk of cyber-attacks. The good news is that big data analysis can offer protection against a wide range of them. Highly sophisticated attack methods and the growing role of malicious insiders in recent security breaches are a clear indication that conventional approaches to information security can no longer keep up. Companies must therefore rethink their cybersecurity approaches and concepts, and analytics is considered a pivotal element in building cyber resilience against increasingly advanced and persistent attacks. At the core of big data analytics is improved detection, which is the starting point for dealing effectively with cyber threats and attacks (BiSurvey.com, 2020).

Data-driven information security can be traced back to bank fraud detection and anomaly-based tampering detection systems. Today, fraud detection is the most common use of data analytic methods. Credit card firms have run fraud detection for decades, but the customized mechanisms they built to mine big data were not economical enough to adapt to other fraud-detection applications. Today, off-the-shelf big data tools and techniques focus on analytics for fraud detection in insurance, healthcare, and other areas.

A few years ago, it was difficult to analyze system events or logs for forensics, and intrusion detection was also a challenge. There are several reasons why conventional approaches failed to deliver the tools needed for large-scale, long-term data analysis. First, storing or retaining huge quantities of data was not economically feasible, so many event logs and other records of computer activity were deleted after a fixed retention period. Second, performing complex queries or analytics on huge structured data sets was highly inefficient, mainly because traditional tools did not leverage big data technologies. Third, traditional analysis tools were not adequate for analyzing and managing unstructured data sets, since they imposed rigid, predefined schemas. Big data tools, such as regular expressions and Pig Latin scripts, can query data in far more flexible formats. Lastly, big data systems usually run on cluster computing infrastructure, which makes them more available and reliable and guarantees that every query in the system is processed to completion.
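As an illustration of querying flexibly formatted data without a fixed schema, the following sketch extracts key=value fields with a regular expression (the log lines and field names are invented; lines that do not match the pattern simply yield nothing):

```python
import re

# Log lines in mixed formats (invented for illustration).
records = [
    "user=alice action=login src=10.0.0.5 status=ok",
    "ERROR 2021-04-01 connection refused from 203.0.113.9",
    "user=bob action=upload src=10.0.0.7 status=denied",
]

# One regular expression extracts key=value pairs from any line that has them,
# with no table definition or schema migration required.
kv = re.compile(r"(\w+)=(\S+)")
parsed = [dict(kv.findall(line)) for line in records]

denied = [p["user"] for p in parsed if p.get("status") == "denied"]
print(denied)
```

This schema-on-read style is exactly what rigid SQL schemas made awkward: the free-text ERROR line is skipped gracefully instead of breaking an import job.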

The analysis and storage of large, heterogeneous data sets are now happening at unprecedented speed and scale. This has been made possible by new big data technologies, for instance the databases of the Hadoop ecosystem. These technologies will transform security analytics in several ways: data can be collected at large scale from multiple internal company sources as well as external ones, such as vulnerability databases; deeper analytics can be performed on that data; a more consolidated view of security-related information becomes possible; and streaming data can be analyzed in real time. It is, however, crucial to note that big data analytics still needs system architects and analysts with a deep understanding of their systems in order to configure the data analysis tools effectively.

There are various ways in which big data analytics can enhance security. The first is network security. Companies such as Zions Bancorporation are using Hadoop clusters and business intelligence tools to analyze more data, more quickly, than conventional SIEM tools allow. In the company's experience, the volume of data and the frequency of event analysis are too much for conventional SIEMs to handle alone. For instance, with the traditional systems, searching through a month's load of data would take between 20 minutes and an hour; using the new Hadoop system and running queries with Hive, the same results come back in approximately one minute. The security data warehouse driving this implementation benefits users in several ways: they can extract relevant security data from diverse sources such as firewalls and security devices, as well as from business processes, website traffic, and other daily transactions. Bringing many disparate data sets and unstructured data into a single analytical framework is among the major promises of big data.
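Hive exposes SQL-style queries over data stored in Hadoop. The shape of such a query can be sketched with Python's built-in sqlite3 as a stand-in; the table, columns, and events here are invented for illustration, and a real deployment would run HiveQL over HDFS rather than an in-memory database:

```python
import sqlite3

# In-memory database standing in for a Hive table of security events.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, event TEXT, host TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("2021-03-01", "login_failed", "web01"),
        ("2021-03-01", "login_failed", "web01"),
        ("2021-03-02", "port_scan", "web02"),
    ],
)

# A Hive query over a month of SIEM data would have this same shape;
# Hadoop's contribution is running it in parallel over terabytes.
rows = conn.execute(
    "SELECT event, COUNT(*) FROM events GROUP BY event ORDER BY 2 DESC"
).fetchall()
print(rows)
```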

Big data analytics may also be widely used for enterprise event analytics. Today, an enterprise routinely collects enormous amounts of security-relevant data, such as network events, software application events, and people action events, for multiple reasons, including post hoc forensic analysis and regulatory compliance. Unfortunately, such a high data volume can overwhelm the enterprise: it can hardly store the data, let alone do anything useful with it. For instance, it is estimated that a large enterprise such as HP generates about 1 trillion events per day, or roughly 12 million events per second. Those numbers will only grow as the enterprise runs more software, deploys more devices, hires more employees, and enables event logging in more data sources.

Existing analytical techniques do not work well at this scale and typically produce so many false positives that their efficacy is undermined. The problem worsens as enterprises move to cloud architectures and collect ever more data; paradoxically, more data then yields less actionable information. Recent research at HP aims to reverse this, so that more data results in better analytics and more actionable information. To achieve this, systems and algorithms must be designed and implemented to identify actionable security information in vast data sets, bringing false-positive rates down to manageable levels; collecting more data then translates into more value. However, multiple challenges must first be solved to realize the true capability of big data analytics, including privacy, legal, and technical issues around scalable data collection, transport, storage, analysis, and visualization. Despite these obstacles, the team at HP Labs has addressed several big-data-analytics-for-security problems, and other enterprises can borrow from HP's efforts to use big data analytics for enterprise event analytics, which will in turn translate into enhanced security.
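The per-second figure quoted for HP can be checked with one line of arithmetic:

```python
# Sanity-check the HP figure: 1 trillion events per day, in events per second.
events_per_day = 1_000_000_000_000
seconds_per_day = 24 * 60 * 60  # 86,400 seconds in a day
events_per_second = events_per_day / seconds_per_day
print(round(events_per_second))  # about 11.6 million, i.e. roughly 12 million/second
```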

Big data analytics can also be used for advanced persistent threat (APT) detection. An APT is a targeted attack against a physical system or a high-value asset. In contrast to mass-spreading malware such as trojans, viruses, and worms, APT attackers operate in a "low and slow" mode: "low" means maintaining a low profile in the network, and "slow" means a long execution time. APT attackers avoid triggering alerts by leveraging stolen user credentials or zero-day exploits. As a result, this kind of attack can run over a long period while the target enterprise remains unaware.

APTs are among the most severe information security threats that companies face today. The basic objective of an APT is to steal intellectual property from a target company, gain access to confidential and sensitive user data, or reach strategic business data that can later be used for illegal insider trading, data poisoning, embarrassment, blackmail, financial gain, or disrupting the company's business. APTs are mostly carried out by motivated, well-financed, and highly skilled attackers who target sensitive data from specific enterprises. APTs are becoming more advanced and sophisticated in both methods and technology, in particular in their ability to use employees of the target organization to penetrate IT systems anonymously through social engineering. Users are tricked into opening a spear-phishing message customized for each victim, delivered via push messages, SMS, or email, and the attackers then download and install specially designed malware that may contain zero-day exploits.

Effective threat detection relies heavily on the knowledge and expertise of human analysts, who build custom signatures and conduct manual investigations. That process is labor-intensive, hard to generalize, and does not scale. Big data analytics is a practical approach to detecting APTs, but the massive amount of data that must be sifted in search of anomalies remains a problem. This data is extracted from diverse, ever-growing sources of information that must first be audited, which makes the detection task more difficult. Given these data volumes, conventional network perimeter defense systems can prove ineffective at detecting targeted attacks, because they do not scale to ever-growing enterprise networks. A new, more effective approach is therefore needed. Most organizations collect data on hosts' and users' activity within the enterprise network, as logged by VPN servers, intrusion detection systems, domain controllers, web proxies, and firewalls.
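A hedged sketch of the "low and slow" intuition: instead of alerting on single large events, score hosts by how persistently they send data to external addresses. The host names, byte counts, and the three-day threshold below are invented for illustration; real detectors weigh many more signals:

```python
from collections import defaultdict

# Daily bytes uploaded per host to an external address (invented data).
uploads = [
    ("web01", "day1", 500_000_000),  # one large, noisy burst
    ("hr-pc", "day1", 2_000_000),    # small but persistent ("low and slow")
    ("hr-pc", "day2", 2_100_000),
    ("hr-pc", "day3", 1_900_000),
    ("hr-pc", "day4", 2_050_000),
]

activity = defaultdict(list)
for host, day, nbytes in uploads:
    activity[host].append(nbytes)

# Flag hosts that upload on many distinct days, even if each day is small;
# a volume-only threshold would flag web01 and miss hr-pc entirely.
suspects = sorted(h for h, days in activity.items() if len(days) >= 3)
print(suspects)
```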

Technology has brought immense benefits, such as the digitalization of the business world. Despite these benefits, companies still face a significant risk of cyber-attacks and have suffered immense data losses at the hands of cybercriminals. To solve this problem, companies have turned to big data analytics. With the growing adoption of mobile and cloud services has come the emergence of more sophisticated tools and methods used by modern cybercriminals. For many years, companies relied on traditional tools, but these have proven ineffective. This calls for an urgent rethinking of how companies approach cybersecurity: they must move past a pure prevention approach and adopt the PDR strategy of preventing, detecting, and responding. By using big data analytics, it will be possible to improve cloud security.


Big Data Working Group. (2013). Big data analytics for security intelligence. Cloud Security Alliance, 1-22. Retrieved from https://downloads.cloudsecurityalliance.org/initiatives/bdwg/Big_Data_Analytics_for_Security_Intelligence.pdf

BiSurvey.com (2020). Big Data Security Analytics: A Weapon Against Rising Cyber Security Attacks? Retrieved from https://bi-survey.com/big-data-security-analytics

Datameer (2018). Challenges to Cyber Security and how Big Data Analytics Can Help. Retrieved from https://www.datameer.com/blog/challenges-to-cyber-security-and-how-big-data-analytics-can-help/

SentinelOne (2016). How Big Data is Improving Cyber Security. Retrieved from https://www.csoonline.com/article/3139923/how-big-data-is-improving-cyber-security.html


Discussion on why Azure is better?

Why Microsoft Azure?

Notes on: A Brief Comparison of Cloud Services

Discussion Team Members: Mehedi Hassan, Klaudia Zmij, Kubanychbek Azhygulov,

Bharath Raveendra, Sonika Sitaula

Published On: April 1, 2021

Overview of the service

    Microsoft Azure is a cloud computing service that works similarly to Amazon Web Services (AWS) and Google's cloud platform. The Azure Platform is the main component of Microsoft's cloud computing services. From a technical perspective, it provides a programming model designed to create scalable, available applications, as well as mechanisms to increase and decrease the computing resources of applications and services (G. Tajadod, 2012).

Microsoft Azure provides a catalog of hundreds of different services, including full virtual machines, databases, file storage, backups, and services for mobile and web applications.

  Microsoft announced its plans for the platform in October 2008, and in early 2009 it entered beta testing. Since then, it has been constantly evolving, adding both infrastructure and application capabilities. In 2008, Ray Ozzie, Chief Software Architect at Microsoft, announced the imminent release of Windows Azure (later renamed simply Azure). Azure was originally designed to serve as a SaaS platform for several Microsoft products but quickly morphed into a cloud services provider modeled after Amazon (R. Kelley, 2020). It enables quick building, deployment, and management of applications across the global network of Microsoft datacenters, as well as easy application scaling and support for any chosen language, framework, or tool (B. Di Martino, 2014).

The service was presented as a cloud platform for developing and running applications and storing data on servers distributed across the corporation's data centers. All data centers are interconnected and form the global Azure infrastructure. Azure operates by deploying a remote virtual machine for each instance of an application or service. The user determines the amount of data storage and computing power required for the task and then connects to the platform. The convenience of this model is that resource consumption can shrink as the task completes, which in turn saves on payment for the resources used.

Windows Azure was created and initially developed as a system implementing the cloud-based PaaS (Platform as a Service) model. Unlike the IaaS model, which essentially lets the customer rent virtualized infrastructure, PaaS also includes a full set of development and execution tools for applied solutions: the operating environment, the necessary frameworks, and infrastructure and application services.

What does the service provide

Microsoft announced its cloud computing platform, then called Windows Azure, in October 2008; it became commercially available in February 2010. The platform offers all three types of cloud services: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS), and it comprises over 600 services (Microsoft Azure, 2021). It also supports various operating systems, databases, and many developer tools. Demand for Azure services has increased exponentially, highlighted by the fact that almost 90% of Fortune 500 companies use Azure (Jain, N., 2019). Below is a brief explanation of the most popular services (Loughman, K., 2019).

Application Services by Azure

Azure application services help you develop, deploy, and maintain varied applications on the Azure platform. These include Azure AI, Azure Analytics, Azure IoT, Azure Active Directory (AD), and Azure Media Services.

The most significant among these are:

Azure AI

Microsoft Azure gives your AI teams fast access to AI services and helps you engage customers, empower your staff, and optimize business processes across the organization. Developers can build interactive applications on Azure's AI platform, which offers capabilities such as natural language processing, speech recognition, voice synthesis, face detection, video indexing, and text analytics. In addition, the Azure Bot Service delivers intelligent agents to several platforms, such as Skype, Facebook Messenger, and websites, so it can be used to build intelligent chat applications.

Azure Analytics

Azure Analytics can be used to make intelligent, vital business decisions by processing and analyzing massive amounts of data. Data engineers and data scientists can analyze large quantities of real-time data in the interactive workspace this service provides.

Azure Internet of Things (IoT)

This service helps an organization create IoT solutions without delay. Pre-configured solutions are offered by Azure IoT Central (SaaS), and custom solutions for your business can be developed with the Azure IoT solution accelerators. Azure IoT Hub (PaaS) is the core that supports both IoT Central and the solution accelerators. IoT devices can also be integrated using the SDKs and integration support that Azure IoT provides.

Data Services by Azure

All storing and managing of data is done in the cloud. Azure offers many data services, including Azure Storage, Redis Cache, Azure SQL Database, StorSimple, and Azure DocumentDB.

Azure Storage

Structured and unstructured data can be stored using the Azure Storage services, which are both scalable and secure. Azure Storage offers safe integration of your on-premises information with cloud data. It can hold data of all types and sizes, such as blob, file, queue, and NoSQL data, from a small blob to a very large file, as business needs dictate and without any effect on the production environment.

Development Services by Azure


Azure provides various development services and tools to improve how software is developed and deployed. One of the most popular is Azure DevOps.

Azure DevOps

Azure DevOps automates your software delivery process and enhances software development by offering an alternative to self-managed CI/CD servers and open-source DevOps tools. It offers fast, reliable tools for hassle-free delivery.

Compute Services by Azure


Application workloads are hosted and run using the Azure compute services: Azure Container Service, Azure VMs, Azure Batch, Azure Service Fabric, and Azure App Services.

Azure Virtual Machines (VMs)

This is a highly scalable and in-demand resource. Cloud administrators just choose an OS, configure the resources needed, and create the server; there is no need to purchase or maintain any hardware.

Azure Container Service

With Azure, one can take advantage of modern container-based development practices and microservices architecture. Azure Service Fabric can be used to migrate .NET applications to microservices, and images can be stored and deployed using either Docker Hub or Azure Container Registry. The most important advantage is that containers take less space than virtual machines, start instantly, and accelerate the development process.

Network Services by Azure

These are the services that carry out networking operations inside Azure and between on-premises infrastructure and Azure. They comprise Azure ExpressRoute, Azure Traffic Manager, Azure Virtual Network, Azure-provided DNS, and the Azure Content Delivery Network (CDN).

Azure Content Delivery Network (CDN)

Azure CDN offers high-bandwidth content delivery for distributing large files, which decreases file download times. Latency and application load are also reduced, since the application is relieved of the content delivery work. Each file is cached at the Azure data center nearest to the user's geographical location.
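The nearest-cache behavior can be sketched as a simple minimum over measured latencies. The edge names and latency figures below are invented for illustration; a real CDN typically routes by DNS and anycast rather than by client-side measurement:

```python
# Toy model of CDN edge selection: serve the user from the edge cache
# with the lowest measured latency (invented edge names and figures).
edge_latency_ms = {"us-east": 120, "europe-west": 35, "asia-east": 210}

nearest = min(edge_latency_ms, key=edge_latency_ms.get)
print(nearest)
```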

Azure Pricing Overview

Azure, as a leading enterprise-grade cloud computing provider, offers highly available applications and APIs with no upfront cost and no termination fees; customers pay only for what they use. Azure notes that standard bandwidth charges apply for moving data in and out of Azure, as well as for moving data between Azure data centers.

As of now, transferring data in is free, as is data transfer within the same availability zone and from an Azure origin to Azure CDN. Starting in July 2021, however, data transfer between virtual machines across availability zones will become a billed activity. Intra-continental data transfer starts at $0.02 per GB for regions within North America and Europe, while Asia, Oceania, and the Middle East and Africa are billed at $0.08 per GB. The most expensive data transfer is for customers in South America, billed at $0.16 per GB.

In terms of internet egress, pricing varies depending on whether traffic is routed via the Microsoft Premium Global Network or via a preferred ISP network. Azure starts billing for any traffic over 5 GB per month, and prices depend on the source continent as well as the data volume; clients pay less per GB as they route more data. Clients in North America pay $0.0875 per GB for 5 GB-10 TB per month, while data in the 150-500 TB range is billed at $0.05 per GB. Internet egress routed via a preferred transit ISP network runs slightly cheaper, with the 5 GB-10 TB tier billed at $0.08 per GB and the 150-500 TB tier currently at $0.04 per GB ("Pricing- Bandwidth", 2021).
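Using the North America rate quoted above, a simplified egress estimate might look like this. Only the first paid tier is modeled here; real Azure billing has more tiers and routing options, so treat this as a back-of-the-envelope sketch:

```python
# Simplified egress estimate: first 5 GB free, then the quoted North America
# rate of $0.0875/GB for the 5 GB-10 TB tier. Higher tiers are not modeled.
FREE_GB = 5
RATE_PER_GB = 0.0875

def egress_cost(gb_per_month):
    billable = max(0, gb_per_month - FREE_GB)
    return round(billable * RATE_PER_GB, 2)

print(egress_cost(1024))  # cost of pushing one terabyte out in a month
```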

Virtual machine pricing also varies with CPU performance and disk capacity. The lower-tier instances run from 1-core to 8-core processors with temporary storage of 20 GB up to 2,040 GB; pricing starts at $0.02/hour for the A0 instance (1 core, 0.75 GB RAM, 20 GB storage), while the A4 instance (8 cores, 14 GB RAM, 2,040 GB storage) is priced at $0.64/hour. Azure now offers faster machines prepared for heavier workloads, suggested for development purposes, servers, and code repositories. The latest A-series generations likewise run from 1 to 8 cores, with the lowest option, the A1v2 (1 core, 2 GB RAM, temporary storage), priced at $0.065/hour. Azure further offers memory-intensive virtual machines for larger databases, server farms, and other heavy workloads. These include the D and E instances: the upgraded D instances provide from 2 up to 20 cores with 14 GB to 256 GB of RAM and up to 1,000 GB of temporary storage, while E instances reach up to 64 vCPUs with up to 432 GB of RAM and 1,600 GB of temporary storage ("Pricing- Cloud", 2021). Lastly, the G-series virtual machines are mostly dedicated to supporting large relational database servers and their workloads while providing unparalleled compute. All in all, Azure has a wide range of virtual machines that can be customized to the client's needs and location, and because clients pay only for the resources they use, scaling the cloud environment up or down is both easy and cost-effective.
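The hourly rates quoted above translate into monthly figures under the common billing assumption of a VM running 24x7, roughly 730 hours per month:

```python
# Rough monthly cost from the hourly rates quoted above, assuming a VM
# runs around the clock (730 hours/month is a common billing approximation).
HOURS_PER_MONTH = 730

def monthly_cost(hourly_rate):
    return round(hourly_rate * HOURS_PER_MONTH, 2)

print(monthly_cost(0.02))  # A0 instance
print(monthly_cost(0.64))  # A4 instance
```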

When it comes to the other services and products Azure offers, each service is priced and billed individually on top of the previously selected options. Again, as with the virtual machines, the pricing is based on package and usage. For instance, Azure Blockchain Service offers two plans: a basic plan for development and testing environments and a standard plan for production workloads. Azure explains that the basic option, with 1 vCore and consortium governance, is charged at $0.0996 per hour per transaction node and validator node, while blockchain storage is billed at $0.05 per GB per month, with an additional blockchain data manager fee of $0.0001 per transaction (“Pricing – Blockchain”, 2021). Because Azure pricing is complex to understand, with every product priced separately and clients billed by usage, Azure also provides a tool on its website, the Pricing Calculator, that helps estimate the actual cost of a service.
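The basic-plan charges above combine three meters: node-hours, storage, and data manager transactions. A rough estimator, where the node count, storage size, and transaction volume are illustrative assumptions rather than values from the source:

```python
# Rough monthly estimate for the Azure Blockchain Service basic plan using
# the rates quoted in the text. The node count, storage size, and
# transaction volume below are illustrative assumptions.
NODE_RATE = 0.0996          # USD per node per hour
STORAGE_RATE = 0.05         # USD per GB of blockchain storage per month
DATA_MANAGER_RATE = 0.0001  # USD per data manager transaction
HOURS_PER_MONTH = 730

def blockchain_monthly(nodes: int, storage_gb: float, transactions: int) -> float:
    """Estimated monthly USD cost: node-hours + storage + data manager fees."""
    cost = (nodes * NODE_RATE * HOURS_PER_MONTH
            + storage_gb * STORAGE_RATE
            + transactions * DATA_MANAGER_RATE)
    return round(cost, 2)

print(blockchain_monthly(nodes=2, storage_gb=50, transactions=100_000))
```

This is exactly the kind of multi-meter arithmetic the Pricing Calculator automates across all Azure products.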

Azure itself offers countless cloud services that are price-competitive with the other leaders in the market, such as AWS or GCP. For instance, both Azure and AWS offer free introductory tiers with low usage limits, and both provide a pay-as-you-go model as well as hourly charges. However, for the actual virtual machines, the smallest instances offering 2 virtual CPUs with 8 GB RAM run slightly cheaper on AWS, with an estimated monthly cost of $69 versus $70 on Azure. The largest instance available is likewise more expensive on Azure: 128 vCPUs with 3.84 TB RAM is priced at $3.95 per hour on AWS and $6.79 per hour on Azure (“AWS vs Azure”, 2021). Nevertheless, Azure offers more affordable storage than AWS, with AWS marking its starting price at $0.023 per GB while Azure offers storage starting at $0.0184 per GB. To determine the better pricing option across cloud services, however, it is crucial to take into account every part of the cloud service package, as each service, machine, product, and storage option has its own usage profile that ultimately determines the overall cost of the cloud deployment.
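The headline storage comparison can be checked with a flat-rate sketch. Real object-storage pricing is tiered and region-dependent, so this only illustrates the starting rates quoted above:

```python
# Flat-rate comparison of monthly storage cost at the quoted starting
# prices. Real storage pricing is tiered and region-dependent, so this
# only illustrates the headline rates from the text.
RATES = {"AWS": 0.023, "Azure": 0.0184}  # USD per GB per month

def storage_costs(gb: float) -> dict:
    """Monthly USD cost of `gb` gigabytes at each provider's starting rate."""
    return {provider: round(rate * gb, 2) for provider, rate in RATES.items()}

print(storage_costs(1_000))
```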

How long has this been around?

Azure was officially announced at the Professional Developers Conference (PDC) in October 2008 and became commercially available in February 2010. Since then, some of Microsoft's other products have been transformed, but not all of them; unlike those services, Azure has become a core Microsoft offering. Originally called Windows Azure, it was renamed Microsoft Azure in July 2014, and its services have been updated continuously ever since. Over time, Azure has come a long way from its 'Red Dog' beginnings.

When Microsoft announced "Windows Azure", the majority of companies saw it as an adaptation of Windows Server with which Microsoft itself would operate its own data centers. The Azure operating system, code-named "Red Dog" after a Silicon Valley landmark, was created by a team of Microsoft OS experts including Dave Cutler, the father of VMS and Windows NT.

Over the past decade, Microsoft has separated the "Windows" name from "Azure". But it still has not moved some of its major cloud services, such as Office 365, to run on Azure as originally planned. A few Office 365 features do operate on Azure, but not core services such as SharePoint and Exchange. Microsoft at first declined to let customers host Azure in their own data centers, but some years later it released Azure Stack, an appliance platform that brings Azure services into customers' own data centers.

Microsoft first officially positioned Azure as Platform-as-a-Service only, but after seeing the inflow of customer revenue it entered the Infrastructure-as-a-Service space. From that point, Microsoft began offering Linux on Azure. Today, more than half of Azure VMs run Linux rather than Windows Server.

Under Microsoft CEO Satya Nadella, Azure has enlarged its territory, with a variety of Linux distributions available as virtual machines on Azure. Currently, CentOS, CoreOS, SUSE Linux, Oracle Linux, Clear Linux, and openSUSE are the best-supported distributions on the Azure platform.

In June 2020, Microsoft moved its entire company onto Azure services, and SAS services also became available through the Azure Marketplace. In September 2020, Microsoft announced the expansion of Azure to other areas, including Canada Central and Australia East, bringing the number of regions with availability zones to 14.

To bring its services to the edge, Microsoft announced three edge computing devices for Azure Stack Edge: the Azure Stack Edge Pro with GPU, the Azure Stack Edge Pro R, and the Azure Stack Edge Mini R, the last a battery-capable portable Azure server that can fit in a backpack.

By most industry accounts, Microsoft ranks as the number-two cloud vendor today. More than a hundred services are currently available on Azure, and Microsoft operates around 54 Azure regions around the world. Azure is a core component of what Microsoft offers today.

Image source: http://labs.sogeti.com/tag/azure/


Other Comparable Services

It’s a fact that Amazon holds the top position in the market, but it is not necessarily the ideal choice for every consumer. Each cloud service company has its areas of specialization, which ought to be considered before making a final decision.

Microsoft Azure specializes in supporting software as a service, retail organizations, and IoT. The platform has various tools for creating, managing, and deploying sensors. It also offers useful tools for real-time data analysis, AI, and knowledge processing.

AWS is the go-to cloud infrastructure for enterprises. Its resources are responsible for processing numerous real-time requests, handling fluctuating client loads, and scaling to different geographical markets. The platform is predominantly geared towards large teams and ambitious projects.

Google Cloud stands out with its program for startups. Companies enrolled in accelerator or venture programs can get a special offer for using the Google Cloud platform. In this way, Google is preparing to become a leading startup cloud. What’s more, the platform offers one of the best AI toolsets available (Ilyukha, 2020).

Here are some of the major reasons to choose Azure:

Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) capabilities:

Azure offers an enticing combination of IaaS and PaaS services. IaaS empowers enterprises to outsource their infrastructure to Azure and pay only for what they use, while PaaS lets them create their own web applications and solutions without buying and maintaining the underlying infrastructure. Top Azure consultants from the Microsoft partner community can work with enterprises to use Azure PaaS and IaaS to build enterprise applications on the cloud for accelerated business growth.

Security Offerings:

Azure was designed around the Security Development Lifecycle (SDL), an industry-leading security process. It builds security in at its core, keeping private data and services safe from harm on the Azure cloud. Microsoft Azure offers the broadest compliance coverage, with more than 50 compliance offerings, and is the most trusted cloud platform among U.S. government agencies. Likewise, it was the first to adopt ISO 27018, the international standard for cloud privacy. Consequently, Microsoft ensures best-in-class safety for all operations and data on the Azure cloud.

Scalability and Elasticity:

Applications that run effectively and scale smoothly from ten users to ten million can be built on Azure with no additional coding. Azure Storage provides flexible, safe, secure, and performant storage services in the cloud. It also becomes simpler to change settings and use more processors as the application grows.

Unmatched Hybrid Capabilities:

Azure has hybrid capabilities that make it exceptional. Azure enables easy portability and a reliable, consistent platform between on-premises and public cloud. Azure provides a broader range of hybrid connections, including virtual private networks (VPNs), caches, content delivery networks (CDNs), and ExpressRoute connections, to improve usability and performance.

Integrated Environment with Other Microsoft Tools:

Companies relying on Microsoft tools like Office 365, Outlook, and SharePoint benefit from investing in a cloud platform that seamlessly integrates with Microsoft products. Using Azure simplifies operations by running the same virtual machines, whether Windows or Linux, as on-premises.

Analytics and Intelligence capabilities:

Azure supports SQL and NoSQL data services and has built-in support for digging deeper into data and uncovering key insights to improve business processes and decision-making. Azure is the only cloud platform that offers Blockchain as a Service (BaaS), Machine Learning, Bots, and Cognitive APIs.

Picking the correct cloud vendor is a crucial decision for enterprises. Many businesses have seen accelerated growth by choosing Azure as their cloud platform.

Easy learning curve:

Familiar tools like Visual Studio and ASP.NET, and programming languages like Visual Basic, C++, and C#, are used to create native cloud applications; it is therefore easy for developers to build cloud-based enterprise applications.

Cost-Efficient Platform:

Microsoft’s pay-as-you-go model allows you to pay only for what you use when building or extending resources with Azure services. This cuts IT administration costs to a minimum, as the infrastructure is taken care of by Microsoft. Azure connects data centers to the cloud easily and supports 42 regions, more than any other cloud provider.


Hybrid applications can be created that combine on-premises applications, such as a cloud database, with Azure. Azure supports Internet protocols and open standards such as XML, SOAP, REST, and HTTP. Software development kits for Java, PHP, and Ruby are available for applications written in those languages, along with Azure tools for Eclipse.


Identity & Access Management (IAM):

Azure offers secure IAM capabilities with the Azure Active Directory service to ensure that the right users access the right information. With Azure, enterprises can adopt and mature IAM capabilities to reduce identity-management costs and become more agile, thereby supporting innovative business activities.


Other public cloud platforms, such as Amazon Web Services and Google Cloud Platform, also provide capabilities similar to Microsoft Azure's. However, given the sheer speed at which Microsoft is shipping new components for Azure, it is evident that Azure scores higher in the marketplace. Trusted Microsoft partners, such as Saviant, are helping enterprises build scalable and secure cloud-native applications by taking advantage of the Azure platform. As an illustrative example, a US-based utility service provider built a Smart Water Metering system using the full stack of Azure services (SAVIANT intelligent solutions, 2021).


  1. Ilyukha, V. (2020). Jelvix. Retrieved from https://jelvix.com/blog/aws-vs-google-cloud-vs-azure
  2. SAVIANT intelligent solutions. (2021). Saviant Consulting. Retrieved from https://www.saviantconsulting.com/blog/10-reasons-why-choose-microsoft-azure.aspx
  3. Pricing – Blockchain Service: Microsoft Azure. (n.d.). Retrieved March 26, 2021, from https://azure.microsoft.com/en-us/pricing/details/blockchain-service/
  4. Pricing – Cloud Services: Microsoft Azure. (n.d.). Retrieved March 24, 2021, from https://azure.microsoft.com/en-us/pricing/details/cloud-services/
  5. Pricing – Bandwidth: Microsoft Azure. (n.d.). Retrieved March 24, 2021, from https://azure.microsoft.com/en-us/pricing/details/bandwidth/
  6. R. Kelley, A. D. Antu, A. Kumar and B. Xie, “Choosing the Right Compute Resources in the Cloud: An analysis of the compute services offered by Amazon, Microsoft and Google,” 2020 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC), Chongqing, China, 2020, pp. 214-223, doi: 10.1109/CyberC49757.2020.00042.
  7. G. Tajadod, L. Batten and K. Govinda, “Microsoft and Amazon: A comparison of approaches to cloud security,” 4th IEEE International Conference on Cloud Computing Technology and Science Proceedings, Taipei, Taiwan, 2012, pp. 539-544, doi: 10.1109/CloudCom.2012.6427581.
  8. B. Di Martino, G. Cretella, A. Esposito and R. G. Sperandeo, “Semantic Representation of Cloud Services: A Case Study for Microsoft Windows Azure,” 2014 International Conference on Intelligent Networking and Collaborative Systems, Salerno, Italy, 2014, pp. 647-652, doi: 10.1109/INCoS.2014.76.
  9. Microsoft Azure. (2021). Retrieved from https://azure.microsoft.com/en-us/services/
  10. Jain, N. (2019). List of Top 10 Azure Services [Blog]. Retrieved from https://www.whizlabs.com/blog/top-azure-services/
  11. Loughman, K. (2019). An Overview of Microsoft Azure Services [Blog]. Retrieved from https://victorops.com/blog/microsoft-azure-services-overview

Healthy Diet for a Healthy & Fit Pregnancy

Diet Advice During Pregnancy

According to the American College of Obstetricians and Gynecologists, a woman will need more calcium, folic acid, iron and protein during pregnancy to support the healthy development of the growing baby.



Calcium

A developing baby needs calcium to build healthy bones and teeth. Calcium also helps the baby grow a healthy heart. Vitamin D is also required, as it aids the absorption of calcium from the gut.
Foods rich in calcium include salmon, broccoli, kale, and yogurt.


Iron

Iron is essential for making hemoglobin in red blood cells. During pregnancy, the amount of blood in your body increases by about 50% to meet the needs of the healthy development of the growing baby. You will therefore need extra iron to make more hemoglobin.
Getting too little iron during pregnancy can lead to anemia, a common problem in pregnancy that can cause fatigue and an increased risk of infections.
Iron is also essential for a healthy immune system.
Red meat is rich in iron.

Folic acid

Folic acid, also known as folate, is a B vitamin that is crucial in helping to prevent birth defects of the baby’s brain and spine, known as neural tube defects.
Most gynecologists recommend that pregnant women take a daily vitamin supplement containing 600 micrograms of folic acid, an amount commonly found in a regular prenatal vitamin.
Foods rich in folic acid include leafy green vegetables, fortified or enriched cereals, bread, and pasta.


Protein

More protein is needed during pregnancy. Protein is the building block of the body’s cells. It is especially important to get enough protein in the second and third trimesters, as the baby grows fastest during these periods.
Foods rich in protein include meat, poultry, fish, dried beans and peas, eggs, nuts, and tofu.

What food should you eat during pregnancy?

Fruits and vegetables

Fruits and vegetables are nutrient-dense and filled with the fiber, vitamins, and minerals that you and your baby need. Eating fruits and vegetables also helps prevent constipation, a common symptom during pregnancy.
Beneficial fruits include avocado, lemon, banana, apples, and berries.
Vegetables like sweet potato, broccoli, beans, and other green, leafy vegetables are all beneficial to pregnant women.

Lean protein

Pregnant women should include good protein sources at every meal to support the baby’s growth. Sources of lean protein include meat, poultry, fish, hard-boiled eggs, beans, tofu, cheese, and nuts.

Whole grains

Whole grains are a vital source of energy in the diet, and they also provide fiber, iron, and B-vitamins.


What food should you avoid during pregnancy?

Raw fish


Do not eat any undercooked or raw fish, as it may contain parasites or bacteria.

Fish with high mercury levels

Pregnant women should avoid fish that contain high levels of mercury, such as king mackerel, marlin, shark, swordfish, tilefish, and tuna. Mercury consumed during pregnancy has been linked to developmental delays and brain damage in the baby.


Raw shellfish

It is best to avoid raw shellfish during pregnancy. Fresh shellfish such as clams, scallops, and oysters can contain Vibrio bacteria.

Smoked seafood

Refrigerated smoked seafood should be avoided because it could be contaminated with Listeria.

Undercooked or raw meat

Raw or undercooked beef or poultry should be avoided during pregnancy because of the risk of contamination with coliform bacteria, toxoplasmosis, and Salmonella.

Soft cheese

Soft cheese made from unpasteurized milk may contain bacteria such as E. coli or Listeria.

Unpasteurized milk

Pregnant women should avoid unpasteurized milk, as it can contain bacteria such as Campylobacter, E. coli, Listeria, or Salmonella.
Pregnant women should drink pasteurized milk.

Raw egg
Pregnant women should avoid unpasteurized or undercooked eggs as they may contain Salmonella.

Caffeine

Coffee, tea, chocolate, and some soft drinks contain high levels of caffeine. Pregnant women should limit their consumption to one or two cups a day.
According to one study published by the National Institute of Health, high maternal caffeine intake during pregnancy is associated with the risk of low birth weight in babies.

Alcohol

Pregnant women should stay away from alcohol, as it increases the risk of miscarriage and stillbirth. Many studies have confirmed that even a small amount can negatively impact the baby’s brain development.
It can also cause fetal alcohol syndrome, which involves facial deformities, heart defects, and intellectual disability.


Occupational Overuse Syndrome (OOS)


OOS is a collective term for conditions that occur due to repeated muscle movements or sustained particular postures. It is therefore quite common in people whose work involves prolonged periods of immobility, such as using computers.

Warning Signs!

You could be suffering from OOS if, at work, you notice features like these:

Pain and stiffness of a joint

Difficulty in moving the joint

Swelling in the forearm, wrist, shoulder

Muscle weakness

Discomfort in the neck or shoulders


In the beginning it is not very painful, as the symptoms come and go, and discontinuing the task tends to improve them. But continued strain can cause severe back and neck pain and make simple tasks, like holding a glass of water or opening a door, difficult.

Not just a minor pain!

Although all OOS begins with mild pain, what may appear to be minor discomfort at the beginning could later lead to:

Tiny muscle tears

Nerve damage due to compression

Abnormal posture

Abnormal muscle movement

Premature degenerative changes

Are you at risk?

If your job entails a lot of the following activities, you are at an increased risk of OOS:


Process work such as assembly line and packing


Manual work

Computer work

How can you prevent OOS?

Here are some easy ways to help you avoid OOS:

Take regular stretching and walking breaks every half an hour.

Deep breathe or meditate for 5 minutes to relax.

Watch your posture and remain active.

Make use of all your tea and meal breaks, and eat a nutritious, balanced diet.

Try to take a break from the tasks that involve repetitive movements.

A healthy lifestyle includes regular exercise, a balanced diet, and a sound sleep pattern. However, most of us seldom make time to abide by this golden rule. To enjoy the bliss of a healthy life, we need to remind ourselves of the long-term benefits of these corrective measures. The experience will be much more comfortable and happy!

Kidney Diseases: Why and How?

Our kidneys are two bean-shaped organs located in the middle of the back, just beneath the rib cage and above the waist.
Healthy kidneys work to remove waste by-products of digestion and muscular activity from our blood; balance water and the concentration of minerals in the body; and synthesize the enzyme renin, needed to maintain optimal blood pressure, and erythropoietin, which in turn stimulates the production of erythrocytes.
Kidneys also help maintain optimal bone health by synthesizing an active form of vitamin D. Continued loss of renal function over a period of time, which may range from months to years, is referred to as kidney disease.
Kidney disease occurs because of genetic problems, injury, or overreaction to certain medicines. People with diabetes, hypertension, or a genetic predisposition are at greater risk of developing it. Continued loss of kidney function is also brought about by the disease glomerulonephritis. Kidney stones, prostate disease, pinworm infestation of the kidney, vasculitis, polycystic kidney disease, and hemolytic-uremic syndrome are also causative factors of kidney disease. Marathon runners who do not pay attention to fluid intake may develop acute renal failure that damages their kidneys.
Symptoms:
The following signs and symptoms are markers of a developing kidney problem:
Changes in urination: the frequency of urination increases or decreases; urine may contain blood; the urge to urinate at night increases; urine is foamy; and there is pain or difficulty in urinating and completely relieving the pressure.
Swelling in the face, hands, or legs.
Fatigue due to altered production of the hormone erythropoietin.
Skin rashes or itching due to deposition of waste material in the blood.
Loss of appetite, nausea and vomiting, and a metallic taste in the mouth due to uremia.
Hypertension, hyperphosphatemia, and hypocalcemia are found in the patient.
Sexual dysfunction and rapid atherosclerosis are seen.
Pain in the bones, or fractures.
Pain in the chest due to inflammation around the heart.
Accumulation of fluid in the lungs, causing shortness of breath.
Numbness in the feet or hands, insomnia, and restless leg syndrome.
Diagnosis:
Blood tests are done to check the levels of blood urea nitrogen (BUN), creatinine, and the glomerular filtration rate (GFR). In kidney disease, the blood levels of BUN and creatinine are high, while there is a significant reduction in GFR.
Urine tests are also done to check the levels of protein and electrolytes as well as the presence of abnormal cells. If a person is suffering from kidney disease, microscopic examination of the urine will reveal unwanted clumping of red and white blood cells. A comparison of the levels of electrolytes in the blood and urine is used to decide whether the kidneys are filtering the blood normally.
Ultrasound of the abdomen is done to check the size of, and any obstruction in, the kidneys.
A kidney biopsy is conducted for microscopic examination of kidney tissue.
A GFR under 60 mL/min/1.73 m² for at least 3 months indicates chronic kidney disease.
Stages of kidney disease:
Stage 1: Kidney function is reduced only slightly. GFR is around 90 or above at this stage. Blood or urine analysis shows certain markers of abnormality or damage. At this stage patients are required to take control of their hypertension and diabetes, exercise regularly, cut down on stress, and abstain from smoking and alcohol.
Stage 2: GFR decreases to between 60 and 89. Apart from the precautions of stage 1, doctors may prescribe other medicines to preserve the health of the blood vessels for possible dialysis in the future.
Stage 3: GFR is between 30 and 59, and symptoms like anemia and bone problems become evident.
Stage 4: GFR is as low as 15 to 29, and various complications related to chronic kidney disease are seen. Preparation for hemodialysis, peritoneal dialysis, or kidney transplantation begins.
Stage 5: GFR is under 15 and kidney failure occurs. Dialysis or transplantation is compulsory.
Younger patients who reach the last stages of kidney disease are often at risk of developing cancer.
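The stage boundaries above reduce to a simple threshold lookup on GFR. A minimal sketch of those cut-offs, illustrative only and not a clinical decision tool:

```python
# Map an estimated GFR (mL/min/1.73 m^2) to the stage thresholds listed
# above. Illustrative only -- not a clinical decision tool.
def ckd_stage(gfr: float) -> int:
    if gfr >= 90:
        return 1
    if gfr >= 60:
        return 2
    if gfr >= 30:
        return 3
    if gfr >= 15:
        return 4
    return 5

print(ckd_stage(95))  # stage 1
print(ckd_stage(45))  # stage 3
print(ckd_stage(12))  # stage 5
```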
Since it is a progressive disease, early identification and treatment are a must to reduce complications. In the initial stages, lifestyle changes and medicines for hypertension, elevated cholesterol levels, and diabetes are prescribed to lower the rate of progression of kidney disease. Angiotensin-converting enzyme inhibitors (ACEIs) or angiotensin II receptor blockers (ARBs) are widely used to slow progression to stage 5. In the advanced stage, replacement of the hormones erythropoietin and calcitriol, as well as phosphate binders, is given. Stage 5 involves hemodialysis three times per week, or peritoneal dialysis that can be carried out at home daily. Kidney transplantation places a healthy kidney, which performs the functions of the two damaged kidneys, in a person who has reached stage 5. After transplantation, dialysis isn't required.

Why Am I Diagnosed with Fatty Liver?

The liver's main functions are removing toxins from the body and processing food nutrients.

Fatty liver is a condition in which excess fat is stored inside liver cells, making it harder for the liver to do its metabolic work.

One of the most common causes of fat buildup in the liver is heavy alcohol use, referred to as alcoholic fatty liver disease. This is a common but preventable condition.

Alcoholic Fatty Liver

As the name clearly states, alcoholic fatty liver disease (AFLD) is caused by excessive alcohol intake. Unlike the brain, the liver is a very resilient organ, capable of quickly regenerating new cells to replace dead ones.

Every time the liver filters alcohol, some of its cells die. Regardless of how resilient the liver is, excessive and prolonged alcohol intake reduces its ability to regenerate cells. This causes fat to build up in the liver, eventually resulting in alcoholic fatty liver disease.

When in its first stages, this condition is easily reversible by merely quitting alcohol for a minimum of 2 weeks.

Non-Alcoholic Fatty Liver

As the name implies, non-alcoholic fatty liver disease (NAFLD) is the result of too much fat accumulating in the liver.

There is no exact reason why some livers accumulate fat, and the same goes for liver inflammation.

So, non-alcoholic fatty liver disease is a condition in which the liver accumulates excess fat that is not due to excessive alcohol consumption.

It’s a widespread condition in clinical practice, affecting about 20% of the world's population.

We care about fatty liver disease because it can cause liver failure and liver cancer, and its consequences may require liver transplantation.

The accumulation of extra fat in the liver can cause chronic irritation of the liver cells and subsequently further liver damage.

There are three common causes of fatty liver disease:

  1. being overweight,
  2. diabetes,
  3. high cholesterol.

Keep in mind that some people with a healthy body weight can also have fatty liver disease.

The increasing epidemic of fatty liver disease has recently been attributed to overweight and the high prevalence of diabetes. Fatty liver disease used to be a rare condition in children under 16.

However, with the increasing problem of obesity in children, fatty liver disease has come to affect about 10% of children. Early in the course of the disease, many patients have no symptoms.

As the scarring and inflammation of the liver progress, many patients complain of problems related to enlargement of the liver, or of fatigue.

If complications such as cirrhosis, or complete scarring, develop from fatty liver disease, patients can have problems with bleeding and symptoms of other complications of liver disease, such as liver cancer or fulminant hepatitis.

We often do blood tests for patients with fatty liver disease, and many will have elevated liver function tests. Imaging modalities such as ultrasound, CT scan, MRI, or FibroScan can be helpful in diagnosing fatty liver disease. Occasionally, patients might need a liver biopsy to assess the extent of the damage to the liver.

Fatty liver disease in the early stage can be completely reversible. However, if the patient continues to accumulate excess fat in the liver, this can lead to complete scarring of the liver, known as cirrhosis, and subsequently to liver failure and its complications, such as liver cancer and the need for liver transplantation.

Fatty liver disease is becoming a more common indication for liver transplantation globally.

Right now, there is no specific treatment for fatty liver disease.

The most important steps are to control the risk factors for liver disease, such as high cholesterol, to control blood sugar, and, for obese individuals, to lose weight. Recent studies suggest a benefit of coffee for patients with liver disease.

Some of the symptoms and risk factors of alcoholic and non-alcoholic fatty liver are:

  • Slight fatigue
  • Confusion or drowsiness
  • Weight loss with swelling in the tummy and ankles
  • Overweight and obesity
  • Insulin resistance
  • High levels of fat in the blood
  • Hyperglycemia (high blood sugar)

The most crucial foods to avoid for patients with fatty liver disease are those rich in refined carbohydrates, such as bread, rice, potatoes, and corn. We recommend that patients consume lean meat and increase the amount of vegetables and salads in their diet.

Several clinical trials are being conducted at Johns Hopkins Hospital in search of a novel therapy for fatty liver disease, using tablet forms of specific drugs that alter the pathophysiology of the disease. The hope is to prevent the accumulation of fat in the liver, reduce the amount of scarring from fatty liver disease, and eventually prevent liver cancer and liver failure.

For a healthy and hassle-free liver, stay tuned to my blog, and keep exercising: run every morning for at least 30 minutes a day.

Dr. Mehedi Hassan
