ISSN: 2574-1241

Impact Factor: 0.548


Research Article | Open Access

Integrated Telemedicine System for ECG Collection and Semi-Automatic Analysis

Volume 59, Issue 1

Eneh Afam Samuel1*, Harmony Nnenna Nwobodo-Nzeribe2 and Okoye Japhet Okwudili3

  • 1Department of Biomedical Engineering, David Umahi Federal University of Health Science, Nigeria
  • 2Department of Computer Engineering, Enugu State University of Science and Technology, Agbani, Nigeria
  • 3Department of Chemical Engineering, Enugu State University of Science and Technology, Agbani, Nigeria

Received: October 01, 2024; Published: October 09, 2024

*Corresponding author: Eneh Afam Samuel, Department of Biomedical Engineering, David Umahi Federal University of Health Science, Uburu, Ebonyi State, Nigeria

DOI: 10.26717/BJSTR.2024.59.009233


ABSTRACT

Significant hazards to cardiovascular health are associated with arrhythmias, or abnormal heartbeats, which frequently need to be detected early for optimal care. Conventional screening techniques may not always yield a timely diagnosis and can be laborious. This paper presents the creation of a novel artificial intelligence (AI)-based tool to improve arrhythmia screening. The device continually monitors and analyzes electrocardiogram (ECG) readings in real time by fusing wearable sensor technologies with sophisticated machine learning algorithms hosted in the cloud via AWS. The AI system can accurately identify patterns and abnormalities linked to different types of arrhythmias because it was trained on a large collection of ECG recordings. Through a battery of validation tests, the device proved that it could identify arrhythmic events with sensitivity and specificity higher than traditional techniques. Timely intervention is facilitated by the user-friendly interface, which offers actionable insights and notifications to healthcare providers as well as patients. This approach provides a more accessible, effective, and dependable early detection technique, thereby addressing important issues in arrhythmia screening. By enabling early diagnosis and tailored treatment, the AI-powered device could enhance patient outcomes and advance the field of cardiovascular care. Subsequent investigations will concentrate on enhancing the algorithm, growing the dataset, and carrying out clinical trials to verify the device's efficacy across a range of demographics.

Keywords: Electrocardiogram; Machine Learning; AWS; Arrhythmias

Introduction

Chronic disease (for example, cardiovascular disease, diabetes, and Alzheimer's disease) has increased, with mortality from coronary heart disease projected to rise to 66% by 2030 [1]. Telehealth is the use of digital information and communication technologies to access health care services remotely and to manage one's health care. These technologies include computers and mobile devices, such as tablets and smartphones, and may be used from home. Nurses or other health care professionals may provide telehealth from a medical office or a mobile van, such as in rural areas. Telehealth can also be technology that health care providers use to improve or support health care services. It also includes virtual healthcare, which uses technology to facilitate better patient-doctor communication in clinics. Electronic communication technology helps doctors, patients, and clinics monitor, follow up, and communicate information on care plans, allowing for full virtual engagement during medical treatment. According to the World Health Organization (WHO), we now live in a world where technology can carry out treatments, perform pre-operative planning, and monitor results from a distance.

To prevent potentially fatal situations, people with chronic illnesses must be closely observed. An erratic and frequently abnormally high heartbeat is the result of the cardiac ailment known as atrial fibrillation. At rest, the heart rate should be regular and between 60 and 100 beats per minute; it can be determined by feeling the pulse at the wrist or neck. Atrial fibrillation is characterized by an erratic heartbeat, which can occasionally be quite fast, sometimes significantly more than 100 beats per minute. This may result in issues such as dizziness, shortness of breath, and tiredness. Palpitations, in which the heart feels as if it is pounding, fluttering, or beating erratically, may be apparent; they often last a few seconds or, in rare circumstances, a few minutes. Atrial fibrillation can occasionally have no symptoms, and the patient is entirely unaware that their heart rate is abnormal. This decade, the Internet of Things (IoT) has been hailed as the beginning of a new era of interconnectedness for everyday devices everywhere, with applications including remote health monitoring, parking management, smart houses, smart cities, smart environments, industrial sites, and agricultural lands [1-3]. Health and environmental conditions can be tracked using IoT in healthcare management, and the importance of IoT systems in real-time applications has increased because of their simple structure.

IoT connects devices to the internet through sensors and networks in order to process data in real time [4], and it has been used in several significant medical science studies to track patients' health. [5] proposed an intelligent health monitoring and diagnosis system based on the Internet of Things and fuzzy logic for cardiac arrhythmia in COVID-19 patients. [6] proposed an Internet of Things-based health monitoring system for early detection of cardiovascular events during the COVID-19 pandemic; the method used IoT and health monitoring sensors, which help to improve medical care systems by enabling latency-sensitive surveillance. [7] proposed real-time and efficient cardiovascular monitoring for COVID-19 patients using 5G-enabled wearable medical devices and a deep learning approach; real-time cardiovascular disease monitoring based on wearable medical devices may effectively reduce COVID-19 mortality rates. [8] reported the diagnostic concordance of telemedicine as compared with face-to-face care in primary health care clinics in rural India in a randomized crossover trial. [9] proposed a new architecture of Internet of Things and big data ecosystem for a secured smart healthcare monitoring and alerting system. The architecture consists of two main sub-architectures, namely Meta Fog-Redirection (MF-R) and Grouping and Choosing (GC). The MF-R architecture uses big data technologies such as Apache Pig and Apache HBase for collection and storage of the sensor data (big data) generated from different sensor devices.

[10] proposed an Internet of Things system with wearable devices and artificial intelligence for uninterrupted healthcare monitoring of the elderly, noting that IoT-based systems can be used to address these challenges. [11] proposed an Internet of Things-based social distance monitoring system for COVID-19; the study follows a qualitative-experimental methodology, as the proposed system can be implemented on a wearable device that helps users maintain social distancing in real time. There has been a rise in global demand for intelligent health surveillance and diagnosis systems for patients with critical health conditions, particularly those with severe heart diseases, and sophisticated measurement tools are used in hospitals worldwide to identify serious heart conditions. Previous researchers have worked on real-time monitoring and diagnosis of various heart diseases. A drawback of this research is that it does not provide the live location or incorporate an AI system that can accurately identify patterns and abnormalities linked to different types of arrhythmias by being trained on a large collection of ECG recordings [12].

Methodology

In the development of an IoT-based telemedicine system for ECG data collection and semi-automated analysis, a systematic and structured design process is important for ensuring that the system meets the requirements and performs effectively. The design process follows a top-down approach. The top-down approach in design is a logical methodology for complex systems that starts from a broad perspective and progressively breaks the system down into detailed specifications and implementations. The process is outlined in three major steps:

Requirements Analysis

The first step in the design process is the presentation of system requirements. This step is critical as it lays the foundation for all subsequent design activities. By clearly defining the functional and non-functional requirements, we ensure a shared understanding of what the system is expected to achieve and can verify it accordingly.

High-Level Framework Discussion

With the requirements established, the next step involves the presentation and discussion of the overall framework at a high level. This framework provides a macro view of the system architecture and illustrates the primary components and their interactions. By discussing the framework at this level, we establish a clear picture of how the different components of the system interact, ensuring coherence and integration in the subsequent detailed design.

Layered Design Breakdown

The final step in the design process involves breaking down the high-level framework into detailed layers. This step is where each layer of the system is designed separately, focusing on the specific functionalities and requirements of each component.

High-Level Architecture

The system is structured into three distinct layers: the edge compute layer, the cloud compute layer, and the visualization layer, responsible for data acquisition, processing and analysis, and visualization respectively. The block diagram illustrates the interactions between these layers (Figure 1).

Figure 1


Layer 1: Edge Compute Layer

The edge compute layer is responsible for the initial acquisition and preprocessing of ECG data. This layer consists of multiple nodes, each equipped with IoT-enabled ECG devices operated by field workers. The key components and functions of this layer include:

ECG Acquisition: Field workers use portable ECG devices to capture the heart activity of patients. These devices are designed for easy use in various environments, ensuring accurate and reliable data collection.

Signal Conditioning: The raw ECG signals are conditioned to remove noise and artifacts, ensuring the data's integrity and quality.

Preprocessing: Basic preprocessing tasks, such as filtering and normalization, are performed on the edge to reduce the data load and prepare it for further analysis in the cloud.

This layer ensures that the initial stages of data handling are managed locally, reducing latency and allowing for real-time feedback to field workers.
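The conditioning and preprocessing steps above can be sketched as follows. This is a minimal illustration, not the authors' firmware: the 0.5-40 Hz passband, the `preprocess_ecg` name, and the min-max normalization are assumptions for the example, while the 125 Hz sampling rate matches the rate stated later in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 125  # sampling rate in Hz, as stated for the edge device later on

def preprocess_ecg(raw, fs=FS, low=0.5, high=40.0):
    """Band-pass filter to suppress baseline wander and high-frequency
    noise, then min-max normalize the segment to [0, 1]."""
    b, a = butter(2, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)  # zero-phase filtering (no lag)
    lo, hi = filtered.min(), filtered.max()
    return (filtered - lo) / (hi - lo) if hi > lo else filtered

# Clean an 8-second noisy test segment (1000 samples at 125 Hz)
t = np.arange(8 * FS) / FS
raw = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 50 * t)
clean = preprocess_ecg(raw)
```

Normalizing on the edge also shrinks the dynamic range of the payload sent to the cloud, which is consistent with the goal of reducing the data load.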

Layer 2: Cloud Compute Layer

The cloud computing layer is an essential component in modern distributed systems, offering vast computational power, storage, and various services over the internet. It allows for the offloading of resource-intensive tasks from local devices to remote servers, thereby leveraging scalable and elastic resources that can adjust to varying workloads. This architecture enhances performance, flexibility, and cost-efficiency, as users can pay for only the resources they consume without the need for significant upfront investments in hardware. The inclusion of a cloud computing layer is primarily centered around the limitations of edge devices, the need for scalability, and cost management. Edge devices, while capable of collecting and preprocessing data, lack the computational power required to run complex machine learning algorithms. Devices used in the edge compute layer are optimized for data acquisition and basic signal processing.

The resource constraints inherent in these devices, such as limited processing power, memory, and storage capacity, make them unsuitable for executing sophisticated and computationally intensive tasks like machine learning model inference. Offloading these tasks to the cloud allows us to bypass these limitations, ensuring that data processing and analysis are performed with the necessary computational resources. Cloud platforms usually provide elastic resources that can be scaled up or down based on demand. The cloud compute layer is divided into two sub-layers: the modeling layer and the message distribution layer. This layer is responsible for advanced data processing, analysis, and distribution.

Modeling Sub-Layer

In this sub-layer, the preprocessed ECG data is analyzed using pre-trained machine learning models. These models are designed to detect patterns and anomalies in the ECG data. The key functions performed at this layer are:

a) The model processes the incoming ECG data, identifying potential heart conditions or irregularities.

b) Based on the analysis, the model labels the ECG data, indicating whether it detects normal or abnormal heart activity.

Message Distribution Sub-layer

This sub-layer manages the routing and storage of the processed data. The functions implemented here are:

a) The layer filters the labeled ECG data and decides where it should be stored.

b) Depending on the results of the filtering process, the data is directed to the appropriate database.
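A minimal sketch of such a filtering rule is shown below. The database names are hypothetical stand-ins (the paper does not give the actual routing configuration), and the labels are the beat classes introduced in the dataset section.

```python
# Labels other than "N" (normal) are treated as requiring urgent review.
ABNORMAL = {"S", "V", "F", "Q"}  # ectopic and unknown beat classes

def route(labeled_sample):
    """Return the destination store for a labeled ECG sample."""
    if labeled_sample["label"] in ABNORMAL:
        return "abnormal_ecg_db"  # hypothetical store flagged for review
    return "normal_ecg_db"        # hypothetical archive of routine data

destination = route({"label": "V", "patient": "p-001"})
```

Keeping the rule this simple lets the message distribution sub-layer stay stateless: each labeled sample carries everything needed to decide its destination.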

Layer 3: Visualization Layer

The visualization layer is the interface between the system and the medical professionals and provides tools for data analysis and decision-making. This layer is designed to enhance the capabilities of doctors and equip them with tools and relevant information to provide optimal patient care. The dashboard provides semi-automatic analysis tools that assist doctors in interpreting the ECG data. The system flags data that requires urgent attention, such as potential heart disease cases, prioritizing these for review. Comprehensive data visualization tools are available to display anomalies, and patient histories, aiding doctors in making informed decisions. The interface allows doctors to interact with the data, adding their observations, and making clinical decisions based on the combined input from the model and their expertise.

Cloud Computing Layer

Cloud computing has ushered in a transformative era in the IT ecosystem, profoundly impacting various industries, including manufacturing. It has not only made IT services more transparent but has also introduced flexibility, scalability, and reduced resource consumption. The cloud's influence extends beyond the IT sector and significantly contributes to the evolution of manufacturing practices. Traditionally, manufacturing operations relied on on-premises equipment health management systems. Cloud-based platforms can empower factory managers with real-time insights into their entire production facility from anywhere and can facilitate critical equipment diagnostics and health assessments through robust data analysis capabilities. Furthermore, the data-intensive nature of digital twins necessitates substantial computing and storage resources. Cloud computing has the ability to revolutionize how energy data is collected, managed, and utilized within manufacturing facilities. Cloud solutions provide the much-needed scalability and flexibility to handle the growing volume of energy-related data generated by end devices. This scalability ensures that manufacturing facilities can adapt to evolving energy monitoring requirements seamlessly. Moreover, the centralization of data within cloud repositories simplifies data management, making it easier to organize, secure, and access critical information. In an integrated energy monitoring framework, where data from various sources must be harmoniously processed, this centralization becomes an advantage.

Machine Learning Sublayer Design

Dataset: The dataset utilized for the machine learning sublayer is the MIT-BIH Arrhythmia Database, which is an extensively used dataset in biomedical signal processing and cardiovascular research. This database contains 48 half-hour excerpts of two-channel ambulatory electrocardiogram recordings. These recordings were collected from a diverse population of patients, including both inpatients and outpatients from the Beth Israel Hospital in Boston, Massachusetts. The dataset was designed to capture a wide variety of arrhythmic events, making it an invaluable resource for developing and testing algorithms aimed at detecting and classifying cardiac arrhythmias. To prepare the dataset for machine learning applications, the half-hour excerpts are often segmented into smaller, more manageable samples. This segmentation process allows for a focused analysis of specific events and patterns within the ECG data. The signals in the dataset are preprocessed and segmented, with each segment corresponding to a heartbeat.

Splitting the Dataset: In biomedical signal classification tasks, two approaches are used to split the data: the inter-patient and intra-patient paradigms. These paradigms define how data is divided for training and testing machine learning models.

Intra-Patient Paradigm: The intra-patient paradigm involves splitting the data so that both the training and testing sets contain ECG recordings from the same set of patients. This approach allows the model to learn and recognize patterns specific to these individuals. Consequently, the model tends to perform well because the variations it needs to account for are limited to those specific patients. However, this may lead to overfitting, as the model might struggle to generalize to ECG data from new, unseen patients with different physiological characteristics.

Inter-Patient Paradigm: In contrast, the inter-patient paradigm splits the data such that the training and testing sets contain recordings from different sets of patients. This setup forces the model to learn more general patterns in the ECG data that are applicable across various individuals. While this can result in lower performance metrics compared to the intra-patient approach, it enhances the model's ability to generalize and perform reliably on ECG data from new patients. This paradigm is more challenging but ultimately more realistic for real-world applications where the model must handle diverse patient data. The dataset at hand is obtained as two CSV files, one each for training and testing, organized using the inter-patient paradigm.
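The inter-patient constraint, that no patient contributes beats to both sides of the split, can be sketched with scikit-learn's GroupShuffleSplit. The toy features and patient IDs below are purely illustrative; the actual dataset already ships pre-split into the two CSV files.

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

X = np.arange(20).reshape(10, 2)                      # 10 toy beats
patients = np.array([1, 1, 1, 2, 2, 3, 3, 4, 4, 4])   # per-beat patient IDs

# Hold out roughly 30% of *patients* (not beats) for testing
splitter = GroupShuffleSplit(n_splits=1, test_size=0.3, random_state=0)
train_idx, test_idx = next(splitter.split(X, groups=patients))

# Patients appearing on both sides would indicate leakage
leakage = set(patients[train_idx]) & set(patients[test_idx])
```

Grouping by patient ID is what distinguishes this from an ordinary shuffled split, which would mix beats from the same patient into both sets and inflate the measured performance.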

The Model

There are numerous machine learning algorithms that can be employed to develop a model for arrhythmia detection. Possible candidates include Decision Trees, Random Forests, K-Nearest Neighbors (KNN), Logistic Regression, and various neural network topologies. For this work, Support Vector Machines (SVMs) were chosen due to their demonstrated high accuracy in previous studies. SVMs are effective in high-dimensional spaces and are particularly well-suited for classification tasks where there is a clear margin of separation between different classes. One study [8] comparing the performance of several classifiers on the dataset determined SVM to be the model with the best performance metrics. It is on this basis that SVM was chosen as the model of choice for this classification task. Support Vector Machines are supervised learning models used for classification and regression tasks. The key idea behind SVMs is to find a hyperplane that best separates different classes in the feature space, maximizing the margin between the closest points of different classes, known as support vectors (Table 1).

Table 1: Classification task.

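A minimal scikit-learn sketch of such a classifier on toy 186-sample "beats" is shown below. The synthetic data only illustrates the API and the maximum-margin idea; it is not the MIT-BIH training set, and the hyperparameters are illustrative defaults rather than the authors' tuned values.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy "beats": 186-sample segments drawn from two synthetic classes
X = np.vstack([
    rng.normal(0.0, 1.0, size=(50, 186)),   # stand-in for normal beats
    rng.normal(2.0, 1.0, size=(50, 186)),   # stand-in for ectopic beats
])
y = np.array(["N"] * 50 + ["V"] * 50)

# RBF-kernel SVM: fits a maximum-margin boundary between the classes,
# with the closest training points acting as support vectors
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)
train_acc = clf.score(X, y)
```

The same `fit`/`predict` interface carries over unchanged when the toy arrays are replaced by the 186-column segments loaded from the dataset's CSV files.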

Amazon Elastic Compute Cloud (EC2) Instance

Amazon Elastic Compute Cloud (EC2) provides the essential scalable computing infrastructure needed for deploying the ECG analysis model within our IoT-enabled telemedicine system. This service enables us to launch virtual server instances with customizable computational power, which is crucial for handling the intensive data processing and machine learning tasks associated with ECG analysis. By utilizing EC2, we can dynamically adjust the computational resources based on the system's demand, ensuring optimal performance and cost-efficiency. The ability to select from various instance types, including those with GPU support, allows us to efficiently manage the high-volume, real-time data analysis required by our application. Furthermore, EC2's integration with other AWS services ensures robust security and seamless data handling, facilitating the secure transmission and storage of sensitive medical information. Overall, Amazon EC2 provides a flexible, reliable, and scalable environment for deploying our ECG analysis model, enhancing the system's capability to deliver timely and accurate telemedicine services.

Visualization Layer

Dashboards: Central to the system is a frontend dashboard, which provides real-time insights and facilitates informed decision-making. This dashboard serves as the user interface and visualizes the patient data. Dashboards mean different things to different people, and there is no universally accepted definition of what a dashboard should be. In this work, we adopt Few's definition [8]: "A dashboard is a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance". A dashboard is a vital tool in modern medicine. It offers healthcare professionals a consolidated and real-time view of patient data and enhances decision-making by providing quick access to crucial metrics and historical data. Dashboards can improve efficiency, support timely interventions, and contribute to better patient outcomes by visualizing complex information in an easily interpretable format.

A central consideration is the cognitive load imposed on users. The human brain's finite working memory capacity means only a limited amount of information can be processed at a given time. When data spans multiple screens or requires scrolling, users must actively integrate information from various sections, leading to cognitive strain, potential information overload, and diminished comprehension. The primary goal of dashboards is to enable users to quickly interpret complex data and make informed decisions. This is why carefully choosing the metrics to be displayed is crucial: it ensures that healthcare professionals have quick and easy access to the most relevant and actionable information needed for patient diagnosis and treatment. Including only the most pertinent data helps avoid information overload, reduces the risk of misinterpretation, and enhances the overall efficiency and effectiveness of patient care.

The following are the charts and metrics shown on the dashboard:

ECG of the Selected Patient

This section displays the electrocardiogram (ECG) of the patient acquired in the manner discussed previously.

Patient Metadata

Important details such as the patient's name, age, phone number, and other relevant information are shown to provide context and ensure accurate patient identification.

Number of Unique Patients in the Database

This metric helps healthcare providers understand the scope of data available, track patient records, and manage patient load effectively.

Key ECG-Derived Metrics

ECG is a gold standard in cardiology for deriving several critical metrics. The following metrics are extracted from the ECG using various methods and shown on the dashboard.

Heart Rate

Heart rate is the number of heartbeats per minute. It is a fundamental vital sign that reflects the functioning of the heart and can indicate various health conditions. Abnormal heart rates can signify arrhythmias, heart disease, or other medical conditions. Monitoring heart rate helps in assessing the patient's cardiovascular health.
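On the dashboard, heart rate can be derived directly from the R-R intervals of the ECG. The sketch below assumes R-peak indices are already detected (peak detection itself is not shown) and uses the 125 Hz sampling rate stated elsewhere in the paper.

```python
import numpy as np

FS = 125  # samples per second, matching the edge sampling rate

def heart_rate_bpm(r_peaks, fs=FS):
    """Mean heart rate in beats per minute from R-peak sample indices."""
    rr_sec = np.diff(r_peaks) / fs      # R-R intervals in seconds
    return 60.0 / float(np.mean(rr_sec))

peaks = np.array([0, 100, 200, 300, 400])  # one beat every 0.8 s
bpm = heart_rate_bpm(peaks)                # 60 / 0.8 = 75 bpm
```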

ECG Quality

ECG quality assesses the accuracy and reliability of the ECG recording. This metric helps medical professionals understand the conditions under which the ECG was recorded. High-quality ECGs are crucial for accurate diagnosis. If the ECG quality is low, the data may be discarded or a new recording may be requested. This ensures that diagnostic decisions are based on reliable data.

Heart Rate Variability

Heart rate variability (HRV) refers to the variation in the time intervals between successive heartbeats, as measured by the R-R intervals on an electrocardiogram (ECG). HRV is computed using various mathematical methods applied to the series of R-R intervals extracted from the ECG signal. Key methods include time-domain, frequency-domain, and nonlinear analyses. In time-domain analysis, metrics like SDNN (Standard Deviation of NN intervals) and RMSSD (Root Mean Square of Successive Differences) quantify the variability and rhythm of heartbeats over time. HRV serves as a valuable diagnostic tool across various clinical and research applications. It provides insights into autonomic nervous system function, cardiovascular health, and overall physiological resilience. Reduced HRV is associated with increased cardiovascular risk, stress, and poorer prognosis in cardiovascular diseases. Conversely, higher HRV indicates greater adaptability, healthier autonomic function, and improved cardiovascular fitness.
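The two time-domain metrics named above can be computed directly from a series of R-R intervals. The sketch below uses a short hypothetical interval series in milliseconds; it illustrates the definitions, not the system's actual HRV pipeline.

```python
import numpy as np

def sdnn(rr_ms):
    """Standard deviation of NN (normal-to-normal) intervals."""
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms):
    """Root mean square of successive differences between intervals."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

rr = np.array([812.0, 798.0, 825.0, 790.0, 810.0])  # hypothetical R-R series (ms)
hrv_sdnn, hrv_rmssd = sdnn(rr), rmssd(rr)
```

SDNN captures overall variability across the whole series, while RMSSD weights beat-to-beat changes and is therefore more sensitive to short-term, vagally mediated variation.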

Exploratory Data Analysis and Dataset Preprocessing

Class Labels

There are five class labels in the dataset, encoded as follows:

N: Non-ectopic beats (normal beats)

S: Supraventricular ectopic beats

V: Ventricular ectopic beats

F: Fusion Beats

Q: Unknown Beats

N: Non-Ectopic Beats (Normal Beat)

These are regular heartbeats with no abnormalities. They follow the normal conduction pathway and are typically used as a reference or baseline to distinguish abnormal rhythms.

S: Supraventricular Ectopic Beats

These are abnormal beats originating above the ventricles, typically in the atria or the atrioventricular node. They include premature atrial contractions and can lead to arrhythmias like atrial fibrillation or supraventricular tachycardia.

V: Ventricular Ectopic Beats

These beats originate from the ventricles and include premature ventricular contractions. They can indicate ventricular arrhythmias such as ventricular tachycardia or ventricular fibrillation.

F: Fusion Beats

Fusion beats occur when a normal beat and an ectopic beat coincide, resulting in a complex that appears as a combination of both.

Q: Unknown Beats

These are beats that cannot be classified into any specific category due to their unusual or unclear characteristics.

Entries

Each entry in the dataset corresponds to a heartbeat. Each ECG in a record is a [1 x 187] vector sampled at 125 Hz. There are 187 columns: the first 186 contain the ECG segment, while the 187th is the class output label. A sample is shown below (Figure 2). The classes are highly unbalanced, as shown in the plot. In a classification task, balanced class labels are crucial for several reasons. Firstly, they ensure that the model is trained equally on each class, preventing bias towards the majority class. This balance helps the model generalize better to new, unseen data, as it learns representative features from all classes rather than focusing on the majority class (Table 2).

Table 2:


Figure 2

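The row layout described above (186 signal columns plus a label column) can be sketched as follows. The in-memory frame is a stand-in for the actual CSV files, with an exaggerated 90/10 imbalance to show how the class distribution is inspected.

```python
import pandas as pd

# Stand-in for the real CSV: 100 rows, 186 signal columns + 1 label column
rows = [[0.1] * 186 + [0.0]] * 90 + [[0.2] * 186 + [2.0]] * 10
df = pd.DataFrame(rows)

X = df.iloc[:, :186].values        # ECG segments, one heartbeat per row
y = df.iloc[:, 186].astype(int)    # class label in the 187th column

counts = y.value_counts()          # reveals the class imbalance
```

With the real files, `pd.DataFrame(rows)` would simply be replaced by `pd.read_csv(...)`; the slicing and the `value_counts` check stay the same.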

Usability Tests

To evaluate the dashboard, a survey was conducted using Google Forms. To access the dashboard, one must be assigned a role from the IAM center in AWS; there is no option to publish to a public space that can be accessed by anyone with the link. Since the participants were not known beforehand and it is time-consuming to assign roles on the spot, the participants were individually presented with the dashboard and asked to interact with it for a minimum of 5 minutes each. The survey was open to a diverse range of participants based on their availability and willingness to allocate a brief period of time to review and assess the dashboard's usability.

The Virtual Edge Device

The Virtual Device (VDevice)

To conduct comprehensive end-to-end system testing, a virtual edge device (hereafter referred to as the vdevice) was developed specifically for generating realistic ECGs. The necessity for such a virtual device arose from the challenge of obtaining diverse and consistent ECG samples for testing purposes. Physically acquiring ECG data for testing can be cumbersome and limited in scope, hence the development of a simulated approach using the ECGSyn waveform generator. ECGSyn is a tool that generates synthetic ECG signals with customizable parameters such as heart rate, beat count, and waveform morphology. It simulates realistic ECG features, including beat-to-beat variation and respiratory sinus arrhythmia, and serves as a pivotal tool in generating pseudo but realistic ECG signals tailored for system testing. The algorithm synthesizes ECG waveforms by modeling the physiological processes and electrical activities of the heart. It incorporates parameters such as heart rate variability, baseline drift, and noise characteristics, and produces signals that mimic real-world ECGs. The vdevice takes the place of the edge hardware for sensing ECG, and the system is tested using samples generated by this generator.
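As a rough stand-in for what the vdevice produces (the real system uses the ECGSyn dynamical model, which is considerably more sophisticated), the sketch below builds each beat as a sum of Gaussian bumps for the P, Q, R, S, and T waves. All wave positions and amplitudes are illustrative assumptions.

```python
import numpy as np

FS = 125  # Hz, matching the edge sampling rate

def synth_beat(fs=FS, rr=0.8):
    """One heartbeat of rr seconds as a crude P-QRS-T waveform."""
    t = np.arange(int(round(rr * fs))) / fs
    waves = [  # (center s, width s, amplitude): P, Q, R, S, T bumps
        (0.20, 0.025, 0.15), (0.36, 0.010, -0.10),
        (0.40, 0.010, 1.00), (0.44, 0.010, -0.25), (0.60, 0.040, 0.30),
    ]
    return sum(a * np.exp(-((t - c) ** 2) / (2 * w ** 2)) for c, w, a in waves)

def synth_ecg(n_beats=10, fs=FS, rr=0.8):
    """Concatenate n_beats beats into one synthetic test signal."""
    return np.concatenate([synth_beat(fs, rr) for _ in range(n_beats)])

sig = synth_ecg()  # 10 beats of 100 samples each
```

A generator like this gives the test harness a deterministic, parameterizable signal source, which is exactly the role the vdevice plays in place of the physical sensing hardware.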

Discussion

To conclude the implementation discussion, here is a recap of some key implementation details that combine all the components developed in the previous sections into a complete picture. The design approach is component-based: each component is developed independently and tested. This modular approach ensures maintainability and scalability. There are three major subsystems or blocks in the system: an edge sensing block for acquiring ECG data and user metadata, a cloud computing block, and a visualization block. The cloud computing block consists of two sublayers: modeling and message distribution. The modeling layer has a machine-learning model for labeling ECG samples. The message distribution layer handles the complexities of routing the labeled samples to their final destination. The final layer is the visualization layer, which consists of logic for fetching and visualizing data from the message broker. Dropdowns and drill-throughs are provided to facilitate semi-automatic analysis of ECG data. The edge sensing layer is implemented on the ESP32 microcontroller, where the sensing unit is the AD8232 ECG front end sampled at 125 Hz. The cloud computing layer is implemented on AWS using an EC2 instance to run the SVM model, and the AWS IoT Core MQTT broker implements the distribution system.

The visualization block is realized using the Dash Plotly web framework. The system's components interface with each other through various protocols: the edge sensing layer interfaces with the model on EC2 via WebSocket, the model is connected to the distribution system via MQTT, and frontends that consume the labeled data receive it via MQTT. The wireless standard used is Bluetooth (IEEE 802.15.1). The interface is implemented as SPP over Bluetooth. SPP (Serial Port Profile) is a Bluetooth profile designed to emulate a traditional serial cable connection over a wireless Bluetooth link. It enables devices to communicate serially, much as they would over the UART connections described earlier, but without the physical cables.

Client-Server Model of Bluetooth

One common communication model used in Bluetooth is the client-server model. The client-server model in Bluetooth networking defines the roles and interactions between devices within a Bluetooth network. It is essential for understanding how devices communicate and exchange data over Bluetooth connections:

a) Server (or acceptor): The Bluetooth server device typically provides services or resources that other devices can access. Servers advertise their services to nearby devices and wait for connection requests.

b) Client (or initiator): The Bluetooth client initiates connections to servers to access their services or resources. Clients actively scan for nearby servers, discover available services, and establish connections. Once connected, clients can send requests to servers and receive responses.
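The acceptor/initiator pattern above can be demonstrated with a loopback TCP analogy in Python. The real system uses Bluetooth RFCOMM sockets rather than TCP, but the accept/connect exchange is structurally the same, so this sketch stands in for the SPP roles under that stated substitution.

```python
import socket
import threading

def acceptor(server_sock):
    """Server (acceptor) role: wait for one connection and
    acknowledge whatever the client sends."""
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"ack:" + data)

def run_demo() -> bytes:
    # Server binds an ephemeral port -- analogous to advertising a service.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]
    t = threading.Thread(target=acceptor, args=(server,))
    t.start()
    # Client (initiator) role: connect, send a request, read the response.
    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(b"hello")
        reply = client.recv(1024)
    t.join()
    server.close()
    return reply

if __name__ == "__main__":
    print(run_demo())  # b'ack:hello'
```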

To implement the wireless interface, we run an SPP acceptor on the ESP32 and an SPP initiator on another Bluetooth-enabled device, which is the device used for collecting the data. A smartphone running the Android operating system is used. The application running on the smartphone was developed with MIT App Inventor, a visual programming environment that allows users to build Android applications through a block-based interface. The app provides a GUI for collecting the metadata; the fields collected are the same as those in the serial case. A picture of the running application is shown below. In both interfaces, when a form is submitted, the user-interface task (serial or wireless) places the form on a buffer, where the WebSocket client finds, formats, and transmits it (Figure 3).

Figure 3

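The form-buffering hand-off described above can be sketched with a thread-safe queue shared between the UI task (producer) and the WebSocket client task (consumer). The field names mirror the sample payload shown later; the function names themselves are illustrative.

```python
import json
import queue

# Shared buffer between the UI task (serial or wireless) and the
# WebSocket client task.
form_buffer = queue.Queue()

def submit_form(name: str, age: int, sex: str, phone: str) -> None:
    """UI task: place a submitted metadata form on the buffer."""
    form_buffer.put({"name": name, "age": age, "sex": sex, "phone": phone})

def format_next_form() -> str:
    """WebSocket client task: fetch the next form and format it as
    JSON ready for transmission."""
    return json.dumps(form_buffer.get())

if __name__ == "__main__":
    submit_form("Eke Chimuanya", 38, "f", "0803458234")
    print(format_next_form())
```

Using a queue decouples the two tasks: form submission never blocks on the network, and the WebSocket client drains forms at its own pace.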

Sensor Data Acquisition and Preprocessing Task

The sensor data acquisition task directly acquires the raw sensor values, scales them, and then constructs a payload from the metadata collected through the user interface. When called, it samples the ADC at 125 Hz for 8 seconds. A sample payload is shown below. This task implements two sub-routines. During system testing, the device generates synthetic ECG signals using the ECGSYN algorithm, mimicking scenarios such as normal heart rhythms, arrhythmias, and other cardiac anomalies. These simulated ECG signals are then transmitted to the cloud-based system through simulated communication channels, emulating the actual data flow in operational settings (Figure 4).

Figure 4


{
"name": "Eke Chimuanya",
"age": 38,
"sex": "f",
"phone": "0803458234",
"ecg": [285, 46, -92, -36, -15, -8, 3, 21, 46, 73, 94, 104, 98, 78, 52, 27, 6, -7, -14, -17, -17, -16, -15, -14, -12, -9, -3, 12, 37, 61, 68, 52, 26, 8, -3, -31, -38, 179, 260, -20, -53, -3, 6, 14, 29, 50, 77, 102, 118, 120, 106, 82, 55, 32, 16, 7, 3, 2, 3, 4, 5, 6, 8, 13, 26, 49, 74, 83, 68, 42, 21, 10, -13, -22, 196, 259, -18, -41, 5, 14, 23, 38, 59, 85, 109, 124, 123, 107, 82, 55, 31, 16, 7, 3, 2, 2, 3, 3, 4, 6, 12, 27, 52, 74, 78, 59, 32, 13, 2, -29, -3, 253, 187, -63, -46, -7, 0, 8, 23, 45, 71, 95, 109, 109, 93, 68, 40, 16, 0, -10, -15, -17, -17, -17, -17, -17, -16, -12, -1, 20, 45, 58, 47, 21, -2, -15, -32, -69, 79, 283, 31, -102, -47, -24, -18, -8, 8, 32, 58, 80, 92, 88, 70, 44, 16, -6, -22, -31, -35, -37, -37, -36, -36, -35, -34, -30, -20, 0, 27, 44, 38, 14, -10, -25, -37, -73, -6, 252, 122, -96, -62, -29, -21, -11, 4, 26, 53, 77, 91, 91, 76, 51, 24, 1, -13, -22, -26, -27, -26, -25, -24, -23, -21, -16, -5, 15, 42, 58, 50, 26, 2, -11, -25, -60, 35, 285, 90, -80, -29, -5, 1, 13, 32, 56, 83, 105, 114, 106, 86, 60, 35, 15, 3, -2, -4, -4, -3, -2, 0, 0, 4, 14, 33, 60, 79, 75, 51, 27, 13, 0, -36, 74, 308, 82, -60, -6, 13, 20, 32, 51, 76, 102, 122, 129, 120, 98, 70, 45, 26, 15, 9, 6, 6, 6, 6, 7, 8, 11, 21, 42, 68, 82, 72, 46, 24, 11, -8, -39, 133, 294, 18, -69, -12, 1, 8, 20, 39, 64, 89, 109, 116, 107, 87, 60, 33, 13, 0, -5, -8, -9, -10, -9, -9, -9, -7, -1, 14, 38, 60, 63, 42, 15, -3, -14, -45, -33, 207, 210, -60, -74, -25, -16, -8, 5, 25, 50, 76, 94, 99, 88, 65, 38, 12, -7, -20, -26, -29, -30, -30, -30, -29, -29, -27, -21, -6, 17, 41, 48, 32, 5, -15, -26, -49, -74, 112, 274, 0, -93, -37, -21, -14, -2, 16, 40, 66, 87, 95, 87, 66, 39, 14, -6, -19, -26, -29, -29, -29, -28, -27, -26]
}
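The acquisition task's payload construction can be sketched as follows. A stub stands in for the real AD8232/ADC read, and the centring-style scaling is an illustrative assumption; what the sketch shows is the structure: 125 Hz for 8 seconds yields 1000 samples, which are scaled and joined with the user metadata into a payload like the one above.

```python
import json
import random

SAMPLE_RATE_HZ = 125
DURATION_S = 8  # 125 Hz x 8 s = 1000 samples per payload

def read_adc() -> int:
    """Stub for the ADC read; returns a raw 12-bit count (0..4095)."""
    return random.randint(0, 4095)

def scale(raw: int) -> int:
    """Illustrative scaling: centre the 12-bit count around zero."""
    return raw - 2048

def build_payload(metadata: dict) -> str:
    """Sample for 8 s at 125 Hz, scale each reading, and attach
    the user metadata collected by the UI task."""
    ecg = [scale(read_adc()) for _ in range(SAMPLE_RATE_HZ * DURATION_S)]
    return json.dumps({**metadata, "ecg": ecg})

if __name__ == "__main__":
    payload = build_payload({"name": "Eke Chimuanya", "age": 38,
                             "sex": "f", "phone": "0803458234"})
    print(len(json.loads(payload)["ecg"]))  # 1000
```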

Conclusion

As technology advances, moving machine learning to the edge presents a promising direction for future research. The current system leverages cloud computing for processing, which allows for complex models and extensive computational resources. However, there is a growing trend toward edge computing, in which data processing occurs closer to the data source; optimizing the system for edge computing could reduce latency and bandwidth consumption. Although edge computing comes with trade-offs, such as simplifying the model(s) used because of hardware constraints, the benefits of reduced latency and increased responsiveness could outweigh these limitations in specific contexts. Another area for future research is conducting extensive clinical trials to validate the system's performance in real-world settings. These trials should involve a diverse patient population across multiple healthcare institutions to ensure the system's generalizability and robustness. Clinical trials would provide valuable insights into the system's practical application, highlight potential areas for improvement, and build confidence among healthcare providers and patients in the system's reliability and accuracy.
