Minireviews Open Access
Copyright ©The Author(s) 2024. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Clin Cases. Nov 26, 2024; 12(33): 6613-6619
Published online Nov 26, 2024. doi: 10.12998/wjcc.v12.i33.6613
Current evidence on artificial intelligence in regional anesthesia
Bhanu Pratap Swain, Deb Sanjay Nag, Rishi Anand, Himanshu Kumar, Pradip Kumar Ganguly, Niharika Singh, Department of Anaesthesiology, Tata Main Hospital, Jamshedpur 831001, India
Bhanu Pratap Swain, Rishi Anand, Himanshu Kumar, Department of Anesthesiology, Manipal Tata Medical College, Jamshedpur 831017, India
ORCID number: Deb Sanjay Nag (0000-0003-2200-9324).
Co-first authors: Bhanu Pratap Swain and Deb Sanjay Nag.
Author contributions: Swain BP, Nag DS, and Anand R designed the overall concept and outline of the manuscript; Kumar H, Ganguly PK, and Singh N contributed to the discussion and design of the manuscript; Swain BP, Nag DS, Anand R, Kumar H, Ganguly PK, and Singh N contributed to the writing and editing of the manuscript and the review of the literature; all of the authors read and approved the final version of the manuscript to be published.
Conflict-of-interest statement: All authors declare no conflict of interest in publishing the manuscript.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: https://creativecommons.org/Licenses/by-nc/4.0/
Corresponding author: Deb Sanjay Nag, MD, Doctor, Department of Anaesthesiology, Tata Main Hospital, C Road West, Northern Town, Bistupur, Jamshedpur 831001, India. ds.nag@tatasteel.com
Received: June 18, 2024
Revised: September 11, 2024
Accepted: September 19, 2024
Published online: November 26, 2024
Processing time: 101 Days and 0.9 Hours

Abstract

Recent advances in regional anesthesia (RA) have been largely attributed to ultrasound technology. However, the safety and efficiency of ultrasound-guided nerve blocks depend upon the skill and experience of the performer. Even with adequate training, experience, and knowledge, human limitations such as fatigue, failure to recognize the correct anatomical structure, and unintentional needle or probe movement can hinder the overall effectiveness of RA. The integration of artificial intelligence (AI) into RA practice promises to overcome these human limitations. Machine learning, an integral part of AI, improves its performance through continuous learning and experience, much like the human brain. It enables computers to recognize images and patterns, which is particularly useful for identifying anatomical structures during the performance of RA. AI can provide real-time guidance to clinicians by highlighting important anatomical structures on ultrasound images, and it can also assist in needle tracking and the accurate deposition of local anesthetics. The future of RA with AI integration appears promising, yet obstacles such as device malfunction, data privacy, regulatory barriers, and cost concerns can deter its clinical implementation. This minireview discusses the current applications, future directions, and barriers to the application of AI in RA practice.

Key Words: Artificial intelligence; Regional anesthesia; Machine learning; Ultrasonography; Nerve block

Core Tip: Proficiency in ultrasound-guided regional anesthesia (UGRA) demands accurate interpretation of sono-anatomy and precise delivery of local anesthetics to the intended location by maneuvering a block needle. Integration of artificial intelligence (AI) can considerably ease the clinician's task by deciphering the correct anatomy and providing real-time needle guidance. It promises to improve the success of UGRA procedures and reduce the complication rate by minimizing human error. Furthermore, AI can be a great tool in education and training: it can help trainees learn regional anesthesia techniques faster and more efficiently. Although the future looks promising, the full integration of AI into clinical practice needs user validation and ample data on clinical outcomes.



INTRODUCTION

Before the advent of ultrasound in regional anesthesia (RA), peripheral nerve blocks (PNBs) were traditionally performed with landmark-based techniques for anatomic guidance[1]. Ultrasound guidance has transformed the practice of RA, making it safer and more successful. With the ability to visualize nerves, muscle planes, blood vessels, and needles in real time, ultrasound-guided RA (UGRA) provides exceptional accuracy and precision[1]. However, mastering the skill of UGRA is no easy feat. It requires extensive training, a deep understanding of sono-anatomy, and good hand-eye coordination. The learning curve can be steep, and trainees often struggle with ultrasound image acquisition and needle tracking[2]. These challenges are especially pronounced in obese patients, those with tissue edema, and deep-seated nerve plexuses[3]. Recent developments in echogenic needles and needle guidance systems have improved needle visibility[4]. Despite these advancements, progress in improving the quality of ultrasound images and their interpretation remains limited. With artificial intelligence (AI) integration, these challenges can be overcome, improving the safety and success of UGRA procedures.

AI is revolutionizing the medical field, providing clinicians with crucial support to deliver swift, effective treatment to their patients[5]. With AI technology, machines and computers can mimic human intelligence and problem-solving abilities, overcoming common human limitations such as fatigue and loss of attention. AI can interpret findings that the human eye may struggle to decipher or may take a long time to process.

In RA, AI-guided solutions can play a critical role in enhancing the success rate of procedures and minimizing complications. AI can assist clinicians with ultrasound image interpretation by highlighting relevant anatomical structures and providing guidance in needle insertion, advancement, and injection of local anesthetics[6]. It can also facilitate the training of novice clinicians in UGRA techniques, allowing them to acquire the required skills quickly and efficiently, reducing the learning curve associated with the procedure[7].

The use of AI technology for ultrasound-guided PNBs and neuraxial blocks has enormous potential. Nonetheless, before implementing these AI models in clinical settings, they must undergo real-case scenario studies to ensure their validity, utility, and safety.

FUNDAMENTALS OF AI

AI technology is based on two fundamental concepts: machine learning (ML) and deep learning (DL). ML involves training computers to learn from data and algorithms, improving their performance over time. It requires exposure to training data and algorithms to identify patterns and improve decision-making. DL, on the other hand, is a subset of ML that uses multi-layered neural networks, called deep neural networks, to perform complex decision-making. These networks are structured similarly to the human brain, enabling them to identify patterns and relationships, recognize phenomena, evaluate different possibilities, and make predictions and decisions. A convolutional neural network (CNN) is a type of DL algorithm that is particularly useful in tasks that involve recognizing and processing images. The architecture of CNNs is modeled on the way the human brain processes visual information. By training on large amounts of data, CNNs can learn to recognize patterns and features that are associated with specific objects or classes. This makes them especially valuable in the field of medical imaging, where they can help clinicians quickly and accurately identify abnormalities[8]. Although AI is revolutionizing healthcare, there are challenges to overcome before its full potential is realized. The availability of high-quality medical data, handled in accordance with data security and privacy requirements, will be critical to achieving the desired outcomes. Although AI-guided clinical therapy may appear evidence-based and realistic, hallucinations can create a false narrative, necessitating human oversight for appropriate care delivery[8].
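
For readers unfamiliar with how such a network processes an image, the following minimal sketch (in Python with the PyTorch library; the layer sizes, input dimensions, and two-class labeling task are illustrative assumptions rather than any clinical product's design) shows how stacked convolutional layers extract image features that a final layer converts into a class prediction:

```python
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """Illustrative CNN: convolutional layers learn local image features
    (edges, textures); the final linear layer maps the pooled features to
    class scores (e.g., 'target nerve present' vs 'absent')."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1 input channel: grayscale
            nn.ReLU(),
            nn.MaxPool2d(2),                             # halve the resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                     # global average pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One 128 x 128 grayscale frame (random values standing in for an ultrasound image)
frame = torch.randn(1, 1, 128, 128)
print(TinyCNN()(frame).shape)  # torch.Size([1, 2]): one score per class
```

Training repeatedly adjusts the convolutional filters against labeled examples until the output scores reflect the patterns associated with each class; this iterative refinement is what "improving performance over time" means in practice.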

Scope of AI in ultrasound-guided RA

Performing RA under ultrasound guidance is a complex, multitasking process that involves many steps: placing the ultrasound probe in the area of interest, acquiring the correct image, deciding where to insert the needle, inserting the needle correctly, adjusting the ultrasound probe to see the needle tip, administering the drug, checking that the anesthetic is spreading correctly, monitoring for complications, and finally assessing the effectiveness of the block. To carry out these tasks, practitioners must have a solid understanding of anatomy and good hand-eye coordination; for novice practitioners and trainees, this can be overwhelming. AI can now assist with or supervise all these steps using information from clinicians, patients, and procedures. Some of the current AI-assisted technologies used in UGRA are real-time anatomic guidance, target detection, and needle tracking[7].

AI-assisted real-time anatomic guidance in UGRA

Ultrasound images are currently two-dimensional (2D) and grayscale, with anatomical structures appearing in varying shades of gray according to tissue echogenicity. Human interpretation of grayscale ultrasound images is nonintuitive and limited by experience. AI can overcome this challenge by highlighting the anatomical structures and providing a visual cue to the operator, making it easier to focus on the intended structure. One of the most common AI technologies used to process ultrasound images is segmentation, where color overlays of various sono-anatomical features are used to differentiate relevant anatomic structures[9-13]. Other, less common methods include drawing bounding boxes around the intended structure, displaying expanding circular rings from the center of the nerve plexus, and placing name tags over the structure in question[14,15].
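
To make the idea of a color overlay concrete, the sketch below (a simplified illustration in Python with NumPy; in a real system the per-pixel class mask would come from a trained segmentation network, and the class list and palette here are hypothetical) alpha-blends class-specific tints onto a grayscale frame:

```python
import numpy as np

# Hypothetical palette: class index -> RGB tint
# (0 = background, 1 = nerve, 2 = artery, 3 = muscle)
COLORS = np.array([[0, 0, 0], [255, 255, 0],
                   [255, 0, 0], [0, 128, 255]], dtype=np.float32)

def overlay(frame_gray: np.ndarray, mask: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Blend a segmentation mask onto a grayscale ultrasound frame.
    frame_gray: (H, W) uint8 image; mask: (H, W) integer class labels."""
    rgb = np.repeat(frame_gray[..., None], 3, axis=-1).astype(np.float32)
    tint = COLORS[mask]                       # (H, W, 3) per-pixel tint color
    blended = np.where(mask[..., None] > 0,   # tint only the labeled pixels
                       (1 - alpha) * rgb + alpha * tint, rgb)
    return blended.round().astype(np.uint8)

# Toy example: a uniform frame with a 2 x 2 region labeled as 'nerve'
frame = np.full((4, 4), 100, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=int)
mask[1:3, 1:3] = 1
print(overlay(frame, mask)[1, 1])  # [162 162  60]: the gray pixel acquires a yellow tint
```

Commercial systems differ in network, palette, and rendering, but the principle of blending a predicted mask over the live image is the same.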

AI can also help the operator obtain the best ultrasound image of the intended structures during the scanning phase of PNBs. Known methodologies include displaying successful-scan indicators[13,16], classifying images as optimal vs non-optimal[17], creating a three-dimensional (3D) model of the scanned image[18], and guiding probe manipulation toward the ideal image[10]. AI solutions are also available for ultrasound-guided central neuraxial blocks that can significantly improve the first-pass success rate and reduce complications. Common techniques include identifying the vertebral level, measuring the distance from the skin to the posterior complex or epidural space, identifying structures by segmentation or bounding boxes, and marking the optimal needle insertion point[19-22].

AI models for PNBs

Currently, a few AI solutions for ultrasound-guided PNBs are commercially available. One of the most studied models is “ScanNav Anatomy Peripheral Nerve Block” (Intelligent Ultrasound, Cardiff, United Kingdom)[23]. It assists image interpretation by providing a color overlay of key sono-anatomical structures in 10 commonly performed PNBs (interscalene, supraclavicular, superior trunk, axillary, erector spinae, rectus sheath, suprainguinal fascia iliaca, femoral, adductor canal, and popliteal nerve blocks). The technology uses a CNN, an advanced DL model based on the U-Net architecture, to delineate anatomical structures on real-time ultrasound. The algorithm uses a database of 80000 ultrasound images as a reference for labeling the sono-anatomical structures in view. ScanNav has recently undergone several validation studies in clinical settings involving experts, non-experts, and trainees[6,9,11,24,25], all of which concluded that this AI-based technology has tremendous potential to help both non-experts and experts in RA practice.
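
The U-Net architecture mentioned above pairs a downsampling encoder with an upsampling decoder and "skip" connections between them, so that per-pixel labels stay aligned with fine anatomical boundaries. A minimal sketch (in Python with PyTorch; the channel counts, depth, and four-class output are illustrative assumptions, not the ScanNav implementation) conveys the idea:

```python
import torch
import torch.nn as nn

def conv_block(c_in: int, c_out: int) -> nn.Sequential:
    """One 3 x 3 convolution followed by a ReLU nonlinearity."""
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU())

class MiniUNet(nn.Module):
    """Minimal U-Net: the encoder compresses the image into semantic
    features, the decoder restores resolution, and the skip connection
    reinjects fine detail from the encoder into the decoder."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.enc = conv_block(1, 16)
        self.down = nn.MaxPool2d(2)
        self.mid = conv_block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = conv_block(32, 16)             # 16 skip + 16 upsampled channels
        self.head = nn.Conv2d(16, num_classes, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        e = self.enc(x)                            # full-resolution features
        m = self.mid(self.down(e))                 # coarse, semantic features
        u = self.up(m)                             # back to full resolution
        d = self.dec(torch.cat([u, e], dim=1))     # skip connection
        return self.head(d)                        # per-pixel class scores

seg = MiniUNet()(torch.randn(1, 1, 128, 128))
print(seg.shape)  # torch.Size([1, 4, 128, 128]): one score map per class
```

Production networks are far deeper and are trained on tens of thousands of labeled frames, as described above, but the encoder-decoder-with-skips layout is the defining feature.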

Nerveblox (Smart Alfa Teknoloji San. Ve Tic. A.S., Ankara, Turkey) is an AI-powered solution for PNBs similar to ScanNav. The device applies a CNN algorithm to real-time ultrasound images to label (color overlay) important anatomical landmarks such as muscles, nerves, bones, and blood vessels. It supports 12 types of nerve blocks: interscalene brachial plexus block, superficial cervical plexus block, supraclavicular brachial plexus block, infraclavicular brachial plexus block, axillary brachial plexus block, pectoralis nerve blocks, transversus abdominis plane block, erector spinae plane block, rectus sheath block, femoral nerve block, adductor canal block, and popliteal sciatic nerve block[26]. A few studies have validated the Nerveblox software as a handy tool for real-time decision-making during PNB procedures[13,15,16]. Both Nerveblox and ScanNav are external devices with separate display screens that must be plugged into a compatible ultrasound machine to function.

Some of the other commercially available AI devices for PNBs are cNerve (GE Healthcare, Chicago, IL, United States)[27], NerveTrack (Samsung, Suwon, South Korea)[28], and Smart Nerve (Mindray, Shenzhen, China)[29].

AI models for central neuraxial block

There are two AI systems available to assist with central neuraxial block (CNB) procedures. One is Accuro, a handheld device manufactured by Rivanna Medical (Charlottesville, VA, United States)[30]. It uses SpineNav3D AI-Based Spine Recognition technology to automatically detect bony landmarks and provide a real-time 3D view of the scan plane. It accurately determines the depth of the epidural space and offers real-time tracking of the needle[30]. Studies have shown that Accuro can reduce the number of needle redirections and increase the chance of first-pass success[22,31-34]. The other AI-enabled software is uSINE, manufactured by HiCura Medical (Singapore), designed for automatic identification of the spinal level and needle guidance during ultrasound-guided CNBs[35]. uSINE is an external device that must be connected to a compatible ultrasound machine. The software guides the operator in achieving an ideal ultrasound view; once the desired view is obtained, the operator marks the skin puncture site and then performs the neuraxial procedure by introducing the needle through the marked site. However, real-time needle tracking is not available with this software. One study reported a 92% first-attempt dural puncture success rate during spinal anesthesia in 100 obstetric patients when uSINE was used to identify the surface landmark[36].

Role of AI in training

Mastering the safe performance of RA can be challenging for trainees because of the steep learning curve involved. However, the advent of AI-assisted technologies has paved the way for an easier and more efficient training process. With multiple AI-integrated devices available on the market, trainees can now more easily overcome the difficulties they face while learning RA. These devices can highlight key anatomical structures and reconstruct 3D anatomy from 2D ultrasound images, making it easier for trainees to visualize the anatomy in three dimensions. Augmented reality (AR) has revolutionized simulation-based training, enabling trainees to improve hand-eye coordination, positioning, and dexterity during needle manipulation[37,38]. Devices like the Needle Trainer™ (Intelligent Ultrasound, Cardiff, United Kingdom) provide a non-invasive simulation of needle manipulation on a live participant using a retractable needle and a virtual needle image superimposed on the ultrasound scan[39,40]. The AR experience can be further enhanced by haptic feedback, which gives the operator the feel of a real needle and tissue. The Regional Anaesthesia Simulator and Assistant (RASimAs) is another AI-based simulator that offers a virtual reality environment with haptic feedback for needle manipulation in RA procedures[41]. With such advanced technology at their disposal, trainees can now learn RA more efficiently and safely, without compromising patient safety.

Furthermore, real-time feedback during training can significantly improve the learning of RA. Recently, eye-tracking technology has been studied in UGRA to assess visual attention to relevant anatomical structures and decision-making in experts and novices[42,43]. These studies found that experts tend to focus on clinically relevant sono-anatomical features, whereas novices struggle to do so. By integrating AI with eye-tracking technology, trainees can be prompted to focus during live procedures, which can accelerate their learning. Similarly, hand motion has been analyzed during the performance of UGRA by experts and trainees. The Imperial College Surgical Assessment Device, a validated measure of hand movement during surgical training, was used to monitor hand movement during the scanning and needling phases of UGRA; experts performed significantly better than trainees[44]. A recent study used Onvision needle tip tracking technology (B Braun Melsungen AG, Melsungen, Germany) to analyze procedure time, hand movement, and path length during simulated ultrasound-guided PNB procedures in a porcine phantom model[45]. The study showed a significant reduction in procedure time and hand movement for out-of-plane techniques when the technology was used compared with when it was not. A similar study in human volunteers undergoing ultrasound-guided lumbar plexus block demonstrated reduced hand movements and path lengths[46]. Hence, AI technology can utilize the data from these devices to improve trainee performance during training.
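
To illustrate the kind of objective metric these motion-analysis systems report, the sketch below (a simplified assumption of the computation, in Python with NumPy; actual devices sample hand or needle-tip position from motion sensors at a fixed rate) derives total path length from a sequence of tracked 3D coordinates:

```python
import numpy as np

def path_length(positions: np.ndarray) -> float:
    """Total distance travelled: the sum of Euclidean distances between
    consecutive tracked (x, y, z) samples.
    positions: (N, 3) array of coordinates, e.g., in millimetres."""
    steps = np.diff(positions, axis=0)             # (N-1, 3) displacements
    return float(np.linalg.norm(steps, axis=1).sum())

# Toy trace: the hand moves 10 mm along x, then 10 mm along y
trace = np.array([[0.0, 0.0, 0.0],
                  [10.0, 0.0, 0.0],
                  [10.0, 10.0, 0.0]])
print(path_length(trace))  # 20.0: shorter paths suggest more economical movement
```

Lower path lengths and fewer discrete hand movements are the kind of endpoints reported in the studies cited above.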

Table 1 summarizes the currently available commercial devices for PNBs and central neuraxial blocks[23,26-30,35,40,45]. Real-time analysis of ultrasound images by AI algorithms that highlight key anatomical structures can enhance their visualization and minimize the risk of accidental puncture of vital structures. AI-guided precise needle placement can potentially lead to higher rates of successful blocks, enhanced pain management, and faster recovery. Similarly, fewer complications such as hematoma, nerve damage, and infection would also translate into better outcomes.

Table 1 Commercially available devices.

| Device name | Manufacturer | Features | Remarks |
| --- | --- | --- | --- |
| ScanNav Anatomy Peripheral Nerve Block[23] | Intelligent Ultrasound, Cardiff, United Kingdom | Color overlay of key sono-anatomical structures in 10 commonly performed peripheral nerve blocks | Plug-and-play device, compatible with selected ultrasound machines |
| Nerveblox[26] | Smart Alfa Teknoloji San. Ve Tic. A.S., Ankara, Turkey | Color overlay of anatomical landmarks; supports 12 types of nerve blocks | Plug-and-play device, compatible with selected ultrasound machines |
| NerveTrack[28] | Samsung, Suwon, South Korea | Yellow box around the nerves | In-built AI feature of the ultrasound machine |
| Smart Nerve[29] | Mindray, Shenzhen, China | Color overlay of neural structures; real-time needle navigation based on 3D magnetic field technology | In-built AI feature of the ultrasound machine |
| cNerve[27] | GE Healthcare, Chicago, IL, United States | Highlights the target nerve to distinguish it from surrounding anatomical structures | In-built AI feature of the ultrasound machine |
| Accuro[30] | Rivanna Medical, Charlottesville, VA, United States | Provides a real-time 3D view of the spine with accurate determination of the depth of the epidural space | Handheld device |
| uSINE[35] | HiCura Medical, Singapore | Real-time identification of the spinal level during central neuraxial blocks using name tags | Plug-and-play device, compatible with most ultrasound probes |
| Needle Trainer™[40] | Intelligent Ultrasound, Cardiff, United Kingdom | Non-invasive simulation of needle manipulation on a live participant | Does not provide haptic feedback of needle manipulation |
| Onvision® Needle Tip Tracking[45] | B Braun Melsungen AG, Melsungen, Germany | Indicates the needle tip position inside the body both in and out of the ultrasound viewing plane | Works exclusively with the Philips Xperius ultrasound system |

Constraints in the integration of AI in RA

AI has displayed remarkable potential in the field of RA and is growing at an accelerated pace. However, the complete implementation of AI in clinical practice faces several challenges.

Currently, numerous AI-enabled devices or models are available, but only a few have undergone external validation. The paucity of data on patient outcomes with these devices and the unstructured, heterogeneous literature available are further challenges. Moreover, the absence of standards to verify the accuracy and utility of these devices constitutes a significant impediment to their incorporation into clinical practice[47].

Effective ML relies on thorough training with vast and diverse datasets. However, the availability and quality of patient data remain a significant concern that can hinder the advancement of AI. Transfer learning has emerged as a valuable method to tackle this challenge by leveraging knowledge gained in one domain and applying it to another[48]. Furthermore, if the input data are flawed, the AI model's output will contain inaccuracies, potentially compromising the efficacy and safety of medical procedures[49]. Additionally, the ethical and legal aspects, along with privacy concerns stemming from such data, need to be carefully addressed. The Alan Turing Institute, a leading center for data science and AI, has developed comprehensive guidelines for ethics and safety in this domain, which should be consulted whenever dealing with ethical matters related to patient data management[50].
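
As a concrete illustration, the common "freeze the backbone, retrain the head" recipe for transfer learning is sketched below (in Python with PyTorch and torchvision; the choice of a ResNet-18 pretrained on natural images and the hypothetical four-class sono-anatomy label set are illustrative assumptions, not a validated clinical pipeline):

```python
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on a large natural-image dataset.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor: its general-purpose filters
# (edges, textures) transfer to ultrasound despite the domain gap, and
# freezing them limits overfitting when clinical data are scarce.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final classification layer with a new head sized for the
# target task; only this small layer is then trained on ultrasound data.
backbone.fc = nn.Linear(backbone.fc.in_features, 4)
```

Only the small new head is fitted to the scarce ultrasound data, which is what makes transfer learning attractive when large clinical datasets are unavailable.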

Though AI-assisted devices possess tremendous abilities, they may fail, just like any other technology. Misidentification of anatomy or an AI model's emphasis on the wrong features may lead to undesired outcomes. The accuracy of AI in specific populations, such as pediatric and geriatric patients and patients with altered anatomy, is not well studied[7]. Therefore, over-reliance on AI assistance has downsides, particularly for trainees and learners who lack confidence in their knowledge of anatomy. A thorough knowledge of the relevant anatomy remains essential. Traditional expert mentoring should not be replaced with AI-assisted devices, and trainees should learn standard techniques of ultrasound image acquisition before exposure to AI-assisted devices.

The implementation of AI is further hindered by its high cost. However, as the technology becomes more widespread, costs are expected to decrease gradually, making it more affordable and accessible in the long run.

AI in RA represents a multidisciplinary field that thrives on collaboration between clinicians, engineers, and administrators. It is imperative to align these disciplines to ensure the successful integration of AI in RA. Regulatory authorities should play a pivotal role in establishing policies that promote the ethical sharing of data across disciplines and institutes. This approach will not only foster the rapid advancement of AI technology but also accelerate its seamless integration into clinical anesthesia practice.

These challenges can only be overcome through collaboration and data sharing between healthcare institutes and research centers, ensuring the availability of diverse yet representative datasets. Rigorous validation in various clinical settings across geographic locations would help standardize and benchmark evaluation metrics for comparing different models. A multidisciplinary approach involving industry, clinicians, researchers, regulatory bodies, and experts in biomedical ethics is the key to unlocking the full potential of AI in RA.

As a relatively new development with significant potential, AI models are slowly being integrated into commercially available ultrasound machines for PNBs and central neuraxial blocks. This can potentially shorten procedure times and yield financial benefits. Similarly, precise needle placement and drug deposition can potentially result in fewer complications, faster recovery, and reduced healthcare costs. Furthermore, with widespread adoption and value realization, the cost of hardware and software for AI in RA is expected to decrease steadily over the next few years.

CONCLUSION

Initial reports suggest that AI applications have tremendous potential to be a game-changer in RA. Currently, several AI models are available for peripheral and central neuraxial blocks. Most of these models use deep neural networks trained on large amounts of data to assist clinicians in real-time decision-making by aiding image interpretation and needle tracking. Additionally, AI can revolutionize education and training in RA, making simulation-based training more pragmatic and intuitive and thus enabling trainees to learn faster. However, some barriers to its assimilation into clinical practice remain. The common impediments include insufficient patient-outcome data on the use of these devices, lack of standardization, and ethical concerns. Clinicians, scientists, and policymakers must coordinate and standardize the technology to overcome these barriers.

In conclusion, a new era of AI-assisted RA has begun. Anaesthesiologists should embrace this new technology and enhance their skills, reduce the learning curve, and provide better patient care by leveraging AI-guided solutions for RA.

Footnotes

Provenance and peer review: Invited article; Externally peer reviewed.

Peer-review model: Single blind

Corresponding Author's Membership in Professional Societies: Indian Society of Anaesthesiology, No. S2863.

Specialty type: Medicine, research and experimental

Country of origin: India

Peer-review report’s classification

Scientific Quality: Grade A, Grade A

Novelty: Grade B, Grade B

Creativity or Innovation: Grade B, Grade B

Scientific Significance: Grade A, Grade B

P-Reviewer: Liang P; Qin XR S-Editor: Luo ML L-Editor: A P-Editor: Zheng XM

References
1. Salinas FV, Hanson NA. Evidence-based medicine for ultrasound-guided regional anesthesia. Anesthesiol Clin. 2014;32:771-787.
2. Delvi MB. Training in ultrasound guided blocks. Saudi J Anaesth. 2011;5:119-120.
3. Henderson M, Dolan J. Challenges, solutions, and advances in ultrasound-guided regional anaesthesia. BJA Education. 2016;16:374-380.
4. Hebard S, Hocking G. Echogenic technology can improve needle visibility during ultrasound-guided regional anesthesia. Reg Anesth Pain Med. 2011;36:185-189.
5. Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019;6:94-98.
6. Bowness J, El-Boghdadly K, Burckett-St Laurent D. Artificial intelligence for image interpretation in ultrasound-guided regional anaesthesia. Anaesthesia. 2021;76:602-607.
7. Viderman D, Dossov M, Seitenov S, Lee MH. Artificial intelligence in ultrasound-guided regional anesthesia: A scoping review. Front Med (Lausanne). 2022;9:994805.
8. Alowais SA, Alghamdi SS, Alsuhebany N, Alqahtani T, Alshaya AI, Almohareb SN, Aldairem A, Alrashed M, Bin Saleh K, Badreldin HA, Al Yami MS, Al Harbi S, Albekairy AM. Revolutionizing healthcare: the role of artificial intelligence in clinical practice. BMC Med Educ. 2023;23:689.
9. Bowness J, Varsou O, Turbitt L, Burkett-St Laurent D. Identifying anatomical structures on ultrasound: assistive artificial intelligence in ultrasound-guided regional anesthesia. Clin Anat. 2021;34:802-809.
10. Smistad E, Iversen DH, Leidig L, Lervik Bakeng JB, Johansen KF, Lindseth F. Automatic Segmentation and Probe Guidance for Real-Time Assistance of Ultrasound-Guided Femoral Nerve Blocks. Ultrasound Med Biol. 2017;43:218-226.
11. Bowness JS, El-Boghdadly K, Woodworth G, Noble JA, Higham H, Burckett-St Laurent D. Exploring the utility of assistive artificial intelligence for ultrasound scanning in regional anesthesia. Reg Anesth Pain Med. 2022;47:375-379.
12. Zhao Y, Zheng S, Cai N, Zhang Q, Zhong H, Zhou Y, Zhang B, Wang G. Utility of Artificial Intelligence for Real-Time Anatomical Landmark Identification in Ultrasound-Guided Thoracic Paravertebral Block. J Digit Imaging. 2023;36:2051-2059.
13. Ahiskalioglu A, Yayik AM, Karapinar YE, Tulgar S, Ciftci B. From ultrasound to artificial intelligence: a new era of the regional anesthesia. Minerva Anestesiol. 2022;88:640-642.
14. Alkhatib M, Hafiane A, Vieyres P, Delbos A. Deep visual nerve tracking in ultrasound images. Comput Med Imaging Graph. 2019;76:101639.
15. Erdem G, Ermiş Y, Özkan D. Peripheral nerve blocks and the use of artificial intelligence-assisted ultrasonography. J Clin Anesth. 2022;78:110597.
16. Gungor I, Gunaydin B, Oktar SO, M Buyukgebiz B, Bagcaz S, Ozdemir MG, Inan G. A real-time anatomy identification via tool based on artificial intelligence for ultrasound-guided peripheral nerve block procedures: an accuracy study. J Anesth. 2021;35:591-594.
17. Jo Y, Lee D, Baek D, Choi BK, Aryal N, Jung J, Shin YS, Hong B. Optimal view detection for ultrasound-guided supraclavicular block using deep learning approaches. Sci Rep. 2023;13:17209.
18. Smistad E, Lindseth F. Real-Time Automatic Artery Segmentation, Reconstruction and Registration for Ultrasound-Guided Regional Anaesthesia of the Femoral Nerve. IEEE Trans Med Imaging. 2016;35:752-761.
19. Ikhsan M, Tan KK, Oh TT, Lew JP, Sng BL. Gabor-based automatic spinal level identification in ultrasound. Annu Int Conf IEEE Eng Med Biol Soc. 2017;2017:3146-3149.
20. Hetherington J, Brohan J, Rohling R, Gunka V, Abolmaesumi P, Albert A, Chau A. A novel ultrasound software system for lumbar level identification in obstetric patients. Can J Anaesth. 2022;69:1211-1219.
21. Hetherington J, Lessoway V, Gunka V, Abolmaesumi P, Rohling R. SLIDE: automatic spine level identification system using a deep convolutional neural network. Int J Comput Assist Radiol Surg. 2017;12:1189-1198.
22. Ni X, Li MZ, Zhou SQ, Xu ZD, Zhang YQ, Yu YB, Su J, Zhang LM, Liu ZQ. Accuro ultrasound-based system with computer-aided image interpretation compared to traditional palpation technique for neuraxial anesthesia placement in obese parturients undergoing cesarean delivery: a randomized controlled trial. J Anesth. 2021;35:475-482.
23. Intelligent Ultrasound. ScanNav Anatomy Peripheral Nerve Block. Available from: https://www.intelligentultrasound.com/scannav-anatomy-pnb.
24. Bowness JS, Burckett-St Laurent D, Hernandez N, Keane PA, Lobo C, Margetts S, Moka E, Pawa A, Rosenblatt M, Sleep N, Taylor A, Woodworth G, Vasalauskaite A, Noble JA, Higham H. Assistive artificial intelligence for ultrasound image interpretation in regional anaesthesia: an external validation study. Br J Anaesth. 2023;130:217-225.
25. Bowness JS, Macfarlane AJR, Burckett-St Laurent D, Harris C, Margetts S, Morecroft M, Phillips D, Rees T, Sleep N, Vasalauskaite A, West S, Noble JA, Higham H. Evaluation of the impact of assistive artificial intelligence on ultrasound scanning for regional anaesthesia. Br J Anaesth. 2023;130:226-233.
26. Nerveblox. Available from: https://www.nerveblox.com.
27. GE Healthcare. Point of Care Ultrasound for Regional Anaesthesia. Available from: https://www.gehealthcare.com/products/ultrasound/venue-family/regional-anesthesia.
28. Samsung. News Centre: The NerveTrack™ solution received FDA clearance. Available from: https://www.samsunghealthcare.com/en/about_us/news_center/170.
29. Mindray. Diagnostic Ultrasound System, Tex20 Series. Available from: https://www.mindray.com/uk/products/ultrasound/point-of-care/tex20-series.
30. Rivanna Medical. Ultrasound Made Simple with Accuro Neuraxial Guidance. Available from: https://rivannamedical.com/why-accuro.
31. Kimizuka M, Tokinaga Y, Taguchi M, Takahashi K, Yamakage M. Usefulness and accuracy of a handheld ultrasound device for epidural landmark and depth assessment by anesthesiology residents. J Anesth. 2022;36:693-697.
32. Ghisi D, Tomasi M, Giannone S, Luppi A, Aurini L, Toccaceli L, Benazzo A, Bonarelli S. A randomized comparison between Accuro and palpation-guided spinal anesthesia for obese patients undergoing orthopedic surgery. Reg Anesth Pain Med. 2019;rapm-2019.
33. Singla P, Dixon AJ, Sheeran JL, Scalzo D, Mauldin FW Jr, Tiouririne M. Feasibility of Spinal Anesthesia Placement Using Automated Interpretation of Lumbar Ultrasound Images: A Prospective Randomized Controlled Trial. J Anesth Clin Res. 2019;10:878.
34. Carvalho B, Seligman KM, Weiniger CF. The comparative accuracy of a handheld and console ultrasound device for neuraxial depth and landmark assessment. Int J Obstet Anesth. 2019;39:68-73.
35. HiCura Medical. uSINE: ultrasound-guided spinal landmark identification with needle navigation system. Available from: https://hicuramedical.com/usine-technology.
36. Oh TT, Ikhsan M, Tan KK, Rehena S, Han NR, Sia ATH, Sng BL. A novel approach to neuraxial anesthesia: application of an automated ultrasound spinal landmark identification. BMC Anesthesiol. 2019;19:57.
37. Ameri G, Rankin A, Baxter JSH, Moore J, Ganapathy S, Peters TM, Chen ECS. Development and Evaluation of an Augmented Reality Ultrasound Guidance System for Spinal Anesthesia: Preliminary Results. Ultrasound Med Biol. 2019;45:2736-2746.
38. Kim TE, Tsui BCH. Simulation-based ultrasound-guided regional anesthesia curriculum for anesthesiology residents. Korean J Anesthesiol. 2019;72:13-23.
39. Shevlin SP, Turbitt L, Burckett-St Laurent D, Macfarlane AJ, West S, Bowness JS. Augmented Reality in Ultrasound-Guided Regional Anaesthesia: An Exploratory Study on Models With Potential Implications for Training. Cureus. 2023;15:e42346.
40. Intelligent Ultrasound. Needle Trainer. Available from: https://www.intelligentultrasound.com/needletrainer.
41. Deserno M, D'Oliveira J, Grottke O. Regional Anaesthesia Simulator and Assistant (RASimAs): Medical Image Processing Supporting Anaesthesiologists in Training and Performance of Local Blocks. In: Proceedings of the IEEE 28th International Symposium on Computer-Based Medical Systems (CBMS 2015); 2015 Jun 22-25; Sao Paulo, Brazil. 2015: 348-351. Available from: https://d.wanfangdata.com.cn/conference/ChZDb25mZXJlbmNlTmV3UzIwMjQwMTA5EiBhYTBmNzYyN2FkZjBiODI5NWRhZjYxYjI2YjFkNjY0YhoIdnBnNWNsZGY%3D.
42. Harrison TK, Kim TE, Kou A, Shum C, Mariano ER, Howard SK; ADAPT (Anesthesiology-Directed Advanced Procedural Training) Research Group. Feasibility of eye-tracking technology to quantify expertise in ultrasound-guided regional anesthesia. J Anesth. 2016;30:530-533.
43. Borg LK, Harrison TK, Kou A, Mariano ER, Udani AD, Kim TE, Shum C, Howard SK; ADAPT (Anesthesiology-Directed Advanced Procedural Training) Research Group. Preliminary Experience Using Eye-Tracking Technology to Differentiate Novice and Expert Image Interpretation for Ultrasound-Guided Regional Anesthesia. J Ultrasound Med. 2018;37:329-336.
44. Chin KJ, Tse C, Chan V, Tan JS, Lupu CM, Hayter M. Hand motion analysis using the imperial college surgical assessment device: validation of a novel and objective performance measure in ultrasound-guided peripheral nerve blockade. Reg Anesth Pain Med. 2011;36:213-219.
45. Kåsine T, Romundstad L, Rosseland LA, Ullensvang K, Fagerland MW, Hol PK, Kessler P, Sauter AR. Needle tip tracking for ultrasound-guided peripheral nerve block procedures: an observer-blinded, randomised, controlled, crossover study on a phantom model. Acta Anaesthesiol Scand. 2019;63:1055-1062.
46. Kåsine T, Romundstad L, Rosseland LA, Ullensvang K, Fagerland MW, Kessler P, Bjørnå E, Sauter AR. The effect of needle tip tracking on procedural time of ultrasound-guided lumbar plexus block: a randomised controlled trial. Anaesthesia. 2020;75:72-79.
47. Bowness JS, Metcalfe D, El-Boghdadly K, Thurley N, Morecroft M, Hartley T, Krawczyk J, Noble JA, Higham H. Artificial intelligence for ultrasound scanning in regional anaesthesia: a scoping review of the evidence from multiple disciplines. Br J Anaesth. 2024;132:1049-1062.
48. Oquab M, Bottou L, Laptev I, Sivic J. Learning and transferring mid-level image representations using convolutional neural networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2014 Jun 23-28; Ohio, United States. 2014: 1717-1724. Available from: https://d.wanfangdata.com.cn/conference/ChZDb25mZXJlbmNlTmV3UzIwMjQwMTA5EiBlMTUwMWM3NGZjMTNjNTI1YTg3ZDRlZTRlMzNjNzlhORoIOXFkcG0yczY%3D.
49. Cascella M, Tracey MC, Petrucci E, Bignami EG. Exploring Artificial Intelligence in Anesthesia: A Primer on Ethics, and Clinical Applications. Surgeries. 2023;4:264-274.
50. The Alan Turing Institute. Artificial intelligence (safe and ethical). Available from: https://www.turing.ac.uk/research/research-programmes/artificial-intelligence-ai/safe-and-ethical.