Computer Science

Computer networks.

Computer networks allow computers to communicate with one another and provide the fundamental infrastructure supporting modern society. Research on computer networks at Yale improves essential network system properties such as efficiency, robustness, and programmability. The research spans all networking layers, including application-network integration (ANI); highly robust, flexible networking; software-defined networking (SDN) and programmable networking applications; and mobile networking.

Faculty working in this area:

  • Anurag Khandelwal
  • Sohee Park
  • Robert Soulé
  • Lin Zhong

Highlights in this area:

Networking is central to modern computing, from WANs connecting cell phones to massive data stores, to the data-center interconnects that deliver seamless storage and fine-grained distributed computing. Because distributed computing infrastructure is a key differentiator for Google, we have long focused on building network infrastructure to support our scale, availability, and performance needs, and on applying our expertise and infrastructure to solve similar problems for Cloud customers. Our research combines building and deploying novel networking systems at unprecedented scale, with recent work focusing on fundamental questions around data center architecture, cloud virtual networking, and wide-area network interconnects. We helped pioneer the use of software-defined networking, the application of ML to networking, and the development of large-scale management infrastructure, including telemetry systems. We are also addressing congestion control and bandwidth management, capacity planning, and designing networks to meet traffic demands. We build cross-layer systems to ensure high network availability and reliability. By publishing our findings at premier research venues, we continue to engage both academic and industrial partners to further the state of the art in networked systems.

Some of our teams:

  • Cloud networking
  • Global networking
  • Network infrastructure


Machine Learning for Computer Systems and Networking: A Survey


Contents

  • 1 Introduction
  • 2.1 Problem Space; 2.2 Solution Space; 2.3 Classification of Selected Works
  • 3 Memory/Cache Management: 3.1 Traditional Approaches and Limitations; 3.2 ML-Based Approaches (3.2.1 Memory Prefetchers, 3.2.2 Page Scheduling, 3.2.3 Cache Admission and Eviction in CDNs); 3.3 Discussion on ML-Based Approaches
  • 4 Cluster Resource Scheduling: 4.1 Traditional Approaches and Limitations; 4.2 ML-Based Approaches; 4.3 Discussion on ML-Based Approaches
  • 5 Query Optimization in Database Systems: 5.1 Traditional Approaches and Limitations; 5.2 ML-Based Approaches (5.2.1 Index Structure Optimization, 5.2.2 Cardinality Estimation, 5.2.3 Join Ordering, 5.2.4 End-to-End Query Optimization); 5.3 Discussion on ML-Based Approaches
  • 6 Network Packet Classification: 6.1 Traditional Approaches and Limitations; 6.2 ML-Based Approaches; 6.3 Discussion on ML-Based Approaches
  • 7 Network Routing: 7.1 Traditional Approaches and Limitations; 7.2 ML-Based Approaches; 7.3 Discussion on ML-Based Approaches
  • 8 Congestion Control: 8.1 Traditional Approaches and Limitations; 8.2 ML-Based Approaches; 8.3 Discussion on ML-Based Approaches
  • 9 Adaptive Video Streaming: 9.1 Traditional Approaches and Limitations; 9.2 ML-Based Approaches; 9.3 Discussion on ML-Based Approaches
  • 10 Discussion and Future Directions
  • Acknowledgments
  • Index Terms

Table 1 classifies the selected works along three dimensions: solution paradigm (supervised, unsupervised or reinforcement learning), environment (centralized or distributed) and temporality (offline or online). The selected works, grouped by domain:

  • Memory/cache management: LSTM Hardware Prefetcher, Learning Access Patterns, Compact Prefetcher, Kleio, Lightweight Caching, RL-Cache
  • Cluster resource scheduling: DeepRM, Device Placement, Decima
  • Query optimization: Learned Index Structures, SkinnerDb, DQ, State Representations, MSCN, Neo
  • Network packet classification: Deep Packet, NeuroCuts
  • Network routing: Learning to Route, DQRC
  • Congestion control: Remy, Vivace, Aurora, Orca, DRL-CC
  • Adaptive video streaming: CS2P, Pensieve

Index Terms

  • Computer systems organization
  • General and reference
  • Document types
  • Surveys and overviews


Published in ACM Computing Surveys (Association for Computing Machinery, New York, NY, United States).

Author Tags

  • Machine learning
  • computer systems
  • computer networking

Funding Sources

  • Dutch Research Council (NWO)
  • Open Competition Domain Science
  • German Research Foundation (DFG)
  • Collaborative Research Center (CRC)




CS 243, Fall 2019: Advanced Computer Networks

This is a graduate-level course on computer networks. It provides a comprehensive overview on advanced topics in network protocols and networked systems. The course will cover both classic papers on computer networks and recent research results. It will examine a wide range of topics including routing, congestion control, network architectures, network management, datacenter networks, software-defined networking, and programmable networks, with an emphasis on core networking concepts and principles and their usage in practice. The course will include lectures, in-class presentations, paper discussions, and a research project.

  • Instructor: Minlan Yu (MD 137)
  • Lecture time: MW 1:30pm-2:45pm
  • Location: TBD
  • Office hours: unlimited, by appointment, for discussing course projects (email the instructor to schedule a time); walk-in hours Monday 12:30-1:30, MD 137
  • Discussion list: Piazza
  • Recommended prep: system programming at the level of CS 61 or CS 143 or CS 145.
  • Project: 50%
  • Reviews: 30%
  • Class presentation: 20%

Additional course pages cover:

  • Review format
  • Class presentation and presentation format
  • Project timeline: proposal presentation, midterm project report, final project presentations, final project report, code submission, evaluation testbed
  • Diversity and inclusion
  • Accommodations for disabilities

Last updated: 2019-10-16 11:59:11 -0400


15 Latest Networking Research Topics for Students

Kiara Miller

Comparative analysis between Snort and Suricata IDS software

Description of the topic

The main focus of this research is to conduct a comparative analysis of the Snort and Suricata intrusion detection systems (IDS) to determine which one provides better performance. Various IDS products are available to organizations, but it is difficult to identify which one is best (Aldarwbi et al., 2022). Organizations with different structures often face problems while setting up an IDS, which results in false positives and missed intrusions. Through this research, it can be identified which IDS performs better and what secure configuration is required to detect intrusions (Waleed et al., 2022).

Research objectives

  • To evaluate the Snort and Suricata IDS software and determine the optimal one.
  • To identify the false positive rates of Snort and Suricata in a networked environment.

Research questions

RQ1: Which IDS software performs better on a production network in terms of performance, security, scalability and reliability?

RQ2: What approaches can be followed to deal with false positives in IDS technology?

Research methodology

The given research objectives and research questions can be addressed using a quantitative research methodology with an experimental approach. For the given topic, both the Snort and Suricata IDS should be configured and tested against different attacks. Based on the findings, it can be analyzed which IDS performs better in terms of performance and security (Shuai & Li, 2021).
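A minimal sketch of how the comparison could be scored, assuming each IDS writes one alert per line to a plain-text log and that the attack and benign flows replayed against the testbed are known in advance; the log paths, log line format and flow tuples below are placeholders, not the actual Snort or Suricata defaults:

```python
# Hypothetical scoring of two IDS alert logs against known ground truth.
# Paths, log line format and flow tuples are placeholders for illustration.

def load_alerts(path):
    """Return the set of (src_ip, dst_ip) pairs that triggered alerts."""
    alerts = set()
    with open(path) as f:
        for line in f:
            parts = line.split()
            if "->" in parts:  # assume "<src> -> <dst>" appears in the line
                i = parts.index("->")
                src = parts[i - 1].split(":")[0]
                dst = parts[i + 1].split(":")[0]
                alerts.add((src, dst))
    return alerts

def score(alerts, attack_flows, benign_flows):
    return {
        "detected": len(alerts & attack_flows),
        "false_positives": len(alerts & benign_flows),
        "missed": len(attack_flows - alerts),
    }

attack_flows = {("10.0.0.5", "10.0.0.10")}   # flows generated by the test attacks
benign_flows = {("10.0.0.7", "10.0.0.10")}   # background traffic

print("Snort   :", score(load_alerts("snort_alerts.log"), attack_flows, benign_flows))
print("Suricata:", score(load_alerts("suricata_alerts.log"), attack_flows, benign_flows))
```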

  • Aldarwbi, M.Y., Lashkari, A.H. and Ghorbani, A.A. (2022) “The sound of intrusion: A novel network intrusion detection system,” Computers and Electrical Engineering , 104, p. 108455.
  • Shuai, L. and Li, S. (2021) “Performance optimization of Snort based on DPDK and Hyperscan,” Procedia Computer Science , 183, pp. 837-843.
  • Waleed, A., Jamali, A.F. and Masood, A. (2022) “Which open-source ids? Snort, Suricata or Zeek,” Computer Networks , 213, p. 109116.

Role of honeypots and honeynets in network security

Network security has become essential nowadays, and there is a need for robust mechanisms to maintain confidentiality and integrity (Feng et al., 2023). Given the number of security mechanisms available, organizations find it hard to choose and implement them on their networks. For example, honeypots and honeynets look almost the same and serve the same purpose but work differently. Under this research topic, honeynets and honeypots can be configured to check which provides better security in terms of trapping cyber attackers. The entire implementation can be carried out on a cloud-based instance, and it can be identified which honeypot technology should be preferred (Maesschalck et al., 2022).

  • To set up a honeypot system using OpenCanary on a virtual instance to protect against cyber attackers.
  • To set up a honeynet system on a virtual instance to ensure protection against malicious attackers.
  • To test the honeypots and honeynets by executing DDoS attacks to check which provides better security.

RQ1: Why is there a need for using honeynets over honeypots in a production network environment?

RQ2: What are the differences between cloud-based and local honeypot systems for endpoint protection?

This research can be carried out using a quantitative research method. At the initial stage, the honeypots and honeynets can be implemented on virtual instances following different security rules. Once the rules are applied, testing can be performed using a Kali Linux machine to check whether the honeypots or the honeynets were more effective (Gill et al., 2020).
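As a hedged illustration of the trapping idea only (this is not OpenCanary itself): a minimal low-interaction listener that logs every connection attempt on an otherwise unused port; the port number, banner and log path are placeholders.

```python
# Minimal low-interaction honeypot sketch: listen on an unused port and log
# every connection attempt. Port, banner and log file are illustrative only.
import socket, datetime

HONEYPOT_PORT = 2222          # a fake SSH-like service
LOG_FILE = "honeypot.log"

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", HONEYPOT_PORT))
    srv.listen()
    while True:
        conn, addr = srv.accept()
        with conn, open(LOG_FILE, "a") as log:
            log.write(f"{datetime.datetime.utcnow().isoformat()} "
                      f"connection attempt from {addr[0]}:{addr[1]}\n")
            conn.sendall(b"SSH-2.0-OpenSSH_8.9\r\n")   # fake banner to look plausible
```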

  • Feng, H. et al. (2023) “Game theory in network security for Digital Twins in industry,” Digital Communications and Networks [Preprint].
  • Gill, K.S., Saxena, S. and Sharma, A. (2020) “GTM-CSEC: A game theoretic model for cloud security based on ids and Honeypot,” Computers & Security , 92, p. 101732
  • Maesschalck, S. et al. (2022) “Don’t get stung, cover your ICS in honey: How do honeypots fit within industrial control system security,” Computers & Security , 114, p. 102598.

How are malware variants progressively improving?

This research can be based on evaluating how malware variants are progressively improving and what their state is expected to be in the future. Malware can compromise confidential user information, which is why this research focuses on identifying the current and future consequences of these improvements (Deng et al., 2023). Little work has been carried out to identify how malware variants are improving and what can be expected in the future. Once the evaluation is done, a clear analysis can also be made of intelligent preventive measures to deal with dangerous malware variants and prevent technological exploitation (Tang et al., 2023).

  • To investigate the types of malware variants available and learn more about malware's hidden features.
  • To focus on the future implications of malware executable programs and how they can be avoided.
  • To discuss intelligent solutions for dealing with all malware variants.

RQ1: How do improvements in malware variants impact enterprises?

RQ2: What additional solutions are required to deal with malware variants?

In this research, qualitative analysis can be conducted on malware variants and the main reasons behind their increasing severity. The entire research can be completed using a qualitative research methodology to answer the defined research questions and objectives. Some real-life case studies that support the selected topic should also be integrated into the research (Saidia Fasci et al., 2023).

  • Deng, H. et al. (2023) “MCTVD: A malware classification method based on three-channel visualization and deep learning,” Computers & Security , 126, p. 103084.
  • Saidia Fasci, L. et al. (2023) “Disarming visualization-based approaches in malware detection systems,” Computers & Security , 126, p. 103062.
  • Tang, Y. et al. (2023) “BHMDC: A byte and hex n-gram based malware detection and classification method,” Computers & Security , p. 103118.

Implementation of an IoT-enabled smart office/home using Cisco Packet Tracer

The Internet of Things (IoT) has gained much attention over the past few years, which is why enterprises and individuals aim to set up IoT networks to automate their processes (Barriga et al., 2023). This research can be based on designing and implementing an IoT-enabled smart home/office network using Cisco Packet Tracer. In the logical workspace, all network devices, including IoT devices, can be arranged in a star topology (Elias & Ali, 2014). To achieve automation, different IoT rules can be defined so that devices act based on those rules.

  • To set up an IoT network on a logical workspace using Cisco Packet Tracer simulation software.
  • To set up IoT-enabled rules on an IoT registration server to achieve automation (Hou et al., 2023).

RQ: Why is Cisco Packet Tracer preferred for network simulation over other network simulators?

At the beginning of this research, a quantitative research methodology can be followed with a proper experimental set-up. As Packet Tracer is to be used, a star topology can interconnect the IoT devices, sensors and other network devices in the home/office. Once placement is done, the devices should be configured with optimal settings and all IoT devices connected to the registration server. This server holds the IoT rules that achieve automation, for example automatically turning off lights and fans when no motion is detected (Baggan et al., 2022).
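Packet Tracer expresses such automation through rules defined on its IoT registration server; purely as a hedged illustration of the underlying condition/action logic (the device names and threshold below are placeholders, not Packet Tracer APIs):

```python
# Illustrative condition/action automation logic, mirroring the kind of rule
# the Packet Tracer IoT server expresses graphically. Names are placeholders.

def evaluate_rules(sensor_state):
    """Return the actions to apply given the latest sensor readings."""
    actions = {}
    if not sensor_state["motion_detected"]:
        actions["living_room_light"] = "off"
        actions["ceiling_fan"] = "off"
    if sensor_state["temperature_c"] > 28:
        actions["air_conditioner"] = "on"
    return actions

print(evaluate_rules({"motion_detected": False, "temperature_c": 30}))
# {'living_room_light': 'off', 'ceiling_fan': 'off', 'air_conditioner': 'on'}
```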

  • Baggan, V. et al. (2022) “A comprehensive analysis and experimental evaluation of Routing Information Protocol: An Elucidation,” Materials Today: Proceedings , 49, pp. 3040–3045.
  • Barriga, J.A. et al. (2023) “Design, code generation and simulation of IOT environments with mobility devices by using model-driven development: Simulateiot-Mobile,” Pervasive and Mobile Computing , 89, p. 101751.
  • Elias, M.S. and Ali, A.Z. (2014) “Survey on the challenges faced by the lecturers in using packet tracer simulation in computer networking course,” Procedia - Social and Behavioral Sciences , 131, pp. 11–15.
  • Hou, L. et al. (2023) “Block-HRG: Block-based differentially private IOT networks release,” Ad Hoc Networks , 140, p. 103059.

Comparative analysis between AODV, DSDV and DSR routing protocols in WSNs

Wireless sensor networks (WSNs) require dedicated WSN routing protocols rather than conventional routing. As WSNs are self-configuring, an optimal routing protocol is needed to improve network performance in terms of latency, jitter and packet loss (Luo et al., 2023). Various problems are faced when WSNs are set up without proper routing protocol selection; as a result, severe downtime occurs and links cannot communicate with each other easily (Hemanand et al., 2023). In this research topic, the three most widely used WSN routing protocols, AODV, DSDV and DSR, can be compared based on network performance. To perform the analysis, three different scenarios can be created in Network Simulator 2 (ns-2).

  • To create three different scenarios in ns-2 to simulate a network for 1 to 100 seconds.
  • To analyze which WSN routing protocol is optimal in terms of network performance metrics, including latency, jitter and packet loss.
  • To use CBR and NULL agents in all wireless scenarios for the simulations.

RQ: How do AODV, DSR and DSDV routing protocols differ from each other in terms of network performance?

This research can be carried out using a quantitative research method. The implementation can be based on ns-2 simulation software, where three different scenarios can be created (AODV, DSDV and DSR). For each scenario, NULL, CBR and UDP agents can be attached to run the simulation for 1 to 100 seconds. For all transmissions made during that time, network performance can be measured to determine which routing protocol is best (Mohapatra & Kanungo, 2012).
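The metrics themselves come from the ns-2 trace files. ns-2 has several trace formats; the hedged sketch below assumes the old-style format (event, time, from-node, to-node, packet type, size, flags, flow id, src, dst, seq, packet id) and would need adapting to the wireless trace format that the AODV/DSDV/DSR scenarios actually emit:

```python
# Parse an ns-2 trace file (old-style format assumed) and report packet
# delivery ratio, average end-to-end delay and drop count for CBR traffic.

def analyze_trace(path):
    send_time = {}                         # packet id -> time of first send
    delays, sent, received, dropped = [], 0, 0, 0
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 12 or fields[4] != "cbr":
                continue
            event, time, pkt_id = fields[0], float(fields[1]), fields[11]
            if event == "s":
                sent += 1
                send_time.setdefault(pkt_id, time)
            elif event == "r":
                received += 1
                if pkt_id in send_time:
                    delays.append(time - send_time[pkt_id])
            elif event == "d":
                dropped += 1
    return {
        "delivery_ratio": round(received / sent, 3) if sent else 0.0,
        "avg_delay_s": round(sum(delays) / len(delays), 4) if delays else 0.0,
        "drops": dropped,
    }

for proto in ("aodv", "dsdv", "dsr"):
    print(proto.upper(), analyze_trace(f"{proto}.tr"))   # trace file names are placeholders
```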

  • Hemanand, D. et al. (2023) “Analysis of power optimization and enhanced routing protocols for Wireless Sensor Networks,” Measurement: Sensors , 25, p. 100610. Available at: https://doi.org/10.1016/j.measen.2022.100610.
  • Luo, S., Lai, Y. and Liu, J. (2023) “Selective forwarding attack detection and network recovery mechanism based on cloud-edge cooperation in software-defined wireless sensor network,” Computers & Security , 126, p. 103083. Available at: https://doi.org/10.1016/j.cose.2022.103083.
  • Mohapatra, S. and Kanungo, P. (2012) “Performance analysis of AODV, DSR, OLSR and DSDV routing protocols using NS2 Simulator,” Procedia Engineering , 30, pp. 69–76. Available at: https://doi.org/10.1016/j.proeng.2012.01.835.

Securing wireless networks using AAA authentication and a WLAN controller

Wireless networks often face intrusion attempts due to insecure protocols and sometimes open SSIDs. As a result, man-in-the-middle and eavesdropping attacks become easier, which leads to the loss of confidential information assets (Sivasankari & Kamalakkannan, 2022). When it comes to managing networks over a large area, there are higher chances of attacks that enable cyber attackers to intercept ongoing communication sessions. However, little research has combined AAA authentication with WLAN controllers to make sure a higher level of protection is provided (Nashwan, 2021). The proposed research topic is based on securing wireless networks with the help of AAA authentication and WLAN controllers. AAA authentication can be used to set up a login portal for users, whilst the WLAN controller can be used to manage all wireless access points connected to the network (Nashwan, 2021).

  • To set up an AAA authentication service on a wireless network simulated in Cisco Packet Tracer for proper access control.
  • To set up a WLAN controller on the network to manage all wireless access points effortlessly.
  • To use the WPA2-PSK protocol on the network to ensure guest users can only access the wireless network over a secure protocol.

RQ1: What additional benefits are offered by AAA authentication on the WLAN networks?

RQ2: Why are wireless networks more likely to face network intrusions than wired networks?

This research topic is based on the secure implementation of a wireless LAN network using Cisco Packet Tracer; hence, it can be carried out using a quantitative research method. The implementation can use AAA authentication, which ensures that access control is applied to wireless logins. In addition, a WLAN controller can be configured to ensure that all wireless access points are managed (Zhang et al., 2012).

  • Nashwan, S. (2021) “AAA-WSN: Anonymous Access Authentication Scheme for wireless sensor networks in Big Data Environment,” Egyptian Informatics Journal , 22(1), pp. 15–26.
  • Sivasankari, N. and Kamalakkannan, S. (2022) “Detection and prevention of man-in-the-middle attack in IOT network using regression modeling,” Advances in Engineering Software , 169, p. 103126.
  • Zhang, J. et al. (2012) “AAA authentication for Network mobility,” The Journal of China Universities of Posts and Telecommunications , 19(2), pp. 81-86.

OWASP's approach to securing web applications against web application exploits

The research can revolve around the development of web applications considering the OWASP Top 10. Usually, web applications are deployed by organizations depending on their requirements, and these applications are vulnerable to various exploits, including injection, broken authentication and other forgery attacks (Poston, 2020). Identifying every single vulnerability is difficult when no reference is used, and organizations often end up hosting a vulnerable server that leads to privacy issues and easily compromised confidential information. In this research, the OWASP Top 10 can be followed to develop a secure web application that can protect against the top web application exploits. This approach emphasizes the severe and minor vulnerabilities that must be patched to protect against web application attacks (Deepa & Thilagam, 2016).

  • The first objective can be setting up an insecure web application in a cloud environment which can be exploited with different techniques.
  • The second objective can be to consider all techniques and procedures provided by the OWASP Top 10 methodology.
  • The last objective can be applying all fixes to the insecure web application to make it resistant to the OWASP Top 10 attacks (Sonmez, 2019).

RQ1: What are the benefits of using the OWASP Top 10 to harden web applications in comparison to other security approaches?

The research methodology for this project can be quantitative, using an experimental approach. The practical work can be done using AWS or the Azure cloud platform. A virtual web server can be configured and set up with a secure and an insecure web application. Following the OWASP Top 10 techniques and procedures, the web application can be secured against possible attacks. In addition, the insecure application can be exploited and the results evaluated (Applebaum et al., 2021).
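As a small illustration of the kind of fix the OWASP Top 10 calls for (A03: Injection), the sketch below contrasts string-built SQL with a parameterized query; it uses Python's standard sqlite3 module purely as a stand-in for whatever database the web application would actually use:

```python
# Injection-prone vs. parameterized query, with sqlite3 as a stand-in database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

user_input = "' OR '1'='1"   # classic injection payload

# VULNERABLE: attacker-controlled input is concatenated into the SQL string.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()

# SAFE: a placeholder lets the driver treat the input strictly as data.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()

print("string-built query returned:", vulnerable)   # leaks the whole table
print("parameterized query returned:", safe)        # returns nothing
```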

  • Applebaum, S., Gaber, T. and Ahmed, A. (2021) “Signature-based and machine-learning-based web application firewalls: A short survey,” Procedia Computer Science , 189, pp. 359–367. Available at: https://doi.org/10.1016/j.procs.2021.05.105.
  • Deepa, G. and Thilagam, P.S. (2016) “Securing web applications from injection and logic vulnerabilities: Approaches and challenges,” Information and Software Technology , 74, pp. 160–180. Available at: https://doi.org/10.1016/j.infsof.2016.02.005.
  • Poston, H. (2020) “Mapping the owasp top Ten to the blockchain,” Procedia Computer Science , 177, pp. 613-617. Available at: https://doi.org/10.1016/j.procs.2020.10.087.
  • Sonmez, F.Ö. (2019) “Security qualitative metrics for Open Web Application Security Project Compliance,” Procedia Computer Science , 151, pp. 998-1003. Available at: https://doi.org/10.1016/j.procs.2019.04.140.

Importance of configuring RADIUS (AAA) server on the network

User authentication has become significant nowadays as it guarantees that a legitimate user is accessing the network. A problem arises when a particular security control must be chosen for authentication and authorization. These controls can be categorized as mandatory access control, role-based access control, captive portals and many more. Among these security controls, one of the most efficient is the RADIUS server (Song et al., 2008). This server can authenticate users on the network to make sure network resources are accessible only to legitimate users. This research topic can be based on understanding the importance of RADIUS servers on the network, which can also be demonstrated with the help of Cisco Packet Tracer. A network can be designed and equipped with a RADIUS server to ensure that only legitimate users can access network resources (Wang et al., 2009).

  • To configure a RADIUS (AAA) server on the network that can authenticate users who try to access network resources.
  • To simulate a network in Packet Tracer simulation software and verify network connectivity.

RQ1: What alternatives to RADIUS (AAA) authentication servers exist for network security?

RQ2: What are the similarities and differences between RADIUS and TACACS+ servers?

As a logical network is to be designed and configured, a quantitative research methodology can be followed. In this research coursework, a secure network design can be produced using the Packet Tracer network simulator, including a RADIUS server and a DMZ area. The RADIUS server can be configured so that users can access network resources only after being authenticated and authorized (Nugroho et al., 2022).
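Outside the simulator, the same Access-Request/Access-Accept exchange can be exercised programmatically. A hedged sketch using the third-party pyrad library (an assumption here, since the topic itself only requires Packet Tracer); the server address, shared secret and credentials are placeholders:

```python
# Send a RADIUS Access-Request and check for Access-Accept.
# Requires the third-party "pyrad" package and a RADIUS attribute dictionary file.
from pyrad.client import Client
from pyrad.dictionary import Dictionary
import pyrad.packet

client = Client(server="192.168.1.10", secret=b"testing123",
                dict=Dictionary("dictionary"))

req = client.CreateAuthPacket(code=pyrad.packet.AccessRequest,
                              User_Name="student1")
req["User-Password"] = req.PwCrypt("student1-password")

reply = client.SendPacket(req)
if reply.code == pyrad.packet.AccessAccept:
    print("Access-Accept: user authenticated")
else:
    print("Access-Reject: authentication failed")
```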

  • Nugroho, Y.S. et al. (2022) “Dataset of network simulator related-question posts in stack overflow,” Data in Brief , 41, p. 107942.
  • Song, M., Wang, L. and Song, J.-de (2008) “A secure fast handover scheme based on AAA protocol in Mobile IPv6 Networks,” The Journal of China Universities of Posts and Telecommunications , 15, pp. 14-18.
  • Wang, L. et al. (2009) “A novel congestion control model for interworking AAA in heterogeneous networks,” The Journal of China Universities of Posts and Telecommunications , 16, pp. 97-101.

Comparing ModSecurity and pfSense firewalls for blocking illegitimate traffic

Firewalls are primarily used for endpoint security due to their advanced features, ranging from traffic blocking to IDS capabilities and more. It is sometimes challenging to identify which type of firewall is best, and for this reason agencies end up deploying misconfigured firewalls (Tiwari et al., 2022). This further results in cyber breaches that disrupt business operations. The research can emphasize a comparison between two widely used firewalls, ModSecurity and pfSense. Using a virtualized environment, both firewalls can be configured and tested against possible cyber-attacks (Lu & Yang, 2020).

  • To set up ModSecurity and pfSense firewalls in a local environment with appropriate access control rules.
  • To test both firewalls by executing distributed denial-of-service attacks from a remote location.
  • To compare which firewall provides better performance and more robust security.

RQ: How do ModSecurity and pfSense differ from each other in terms of features and performance?

The practical experimentation for both firewalls can be done in a virtualized environment where two different machines are created; hence, this research can be carried out using a quantitative research method. The first machine can run ModSecurity and the second pfSense. A new subnet can be created containing these two machines. A third, attacking machine can be used for testing the firewalls. The results obtained can then be evaluated to identify which firewall provides better security (Uçtu et al., 2021).
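One way to quantify "better performance" is to probe a web service behind each firewall and record response latency and failure rate under the same test traffic; a minimal standard-library sketch, with placeholder target URLs for the two protected machines:

```python
# Probe a service behind each firewall and record latency and failure rate.
# Target URLs, attempt count and timeout are placeholders for the test setup.
import time
import urllib.request
import urllib.error

def probe(url, attempts=50, timeout=2.0):
    latencies, failures = [], 0
    for _ in range(attempts):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout):
                latencies.append(time.monotonic() - start)
        except (urllib.error.URLError, OSError):
            failures += 1
        time.sleep(0.1)
    avg_ms = 1000 * sum(latencies) / len(latencies) if latencies else float("nan")
    return {"avg_latency_ms": round(avg_ms, 1), "failure_rate": failures / attempts}

print("behind ModSecurity:", probe("http://10.0.1.10/"))
print("behind pfSense:    ", probe("http://10.0.2.10/"))
```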

  • Lu, N. and Yang, Y. (2020) “Application of evolutionary algorithm in performance optimization of Embedded Network Firewall,” Microprocessors and Microsystems , 76, p. 103087.
  • Tiwari, A., Papini, S. and Hemamalini, V. (2022) “An enhanced optimization of parallel firewalls filtering rules for scalable high-speed networks,” Materials Today: Proceedings , 62, pp. 4800-4805.
  • Uçtu, G. et al. (2021) “A suggested testbed to evaluate multicast network and threat prevention performance of Next Generation Firewalls,” Future Generation Computer Systems , 124, pp. 56-67.

Conducting a comprehensive investigation of the PETYA malware

The main purpose of this research is to conduct a comprehensive investigation of the PETYA malware variant (McIntosh et al., 2021). PETYA falls under the category of ransomware, which not only corrupts and encrypts files but can also compromise confidential information. Along with PETYA, there are other variants that lead to security outages, and organizations are unable to detect them due to a lack of proper detection capabilities (Singh & Singh, 2021). In this research, a comprehensive analysis can be done of the PETYA malware to identify how it works and its severity level. Depending on the possible causes of PETYA infection, some proactive defensive techniques can also be discussed (Singh & Singh, 2021). A separate discussion can also be made of other malware variants, their features, and more.

  • The main objective of this research is to scrutinize the working of PETYA malware, because a ransomware attack can severely impact the micro and macro environment of an organization.
  • The working of PETYA malware, along with its source code, can be reviewed to identify its structure and encryption type.
  • To list all possible CVE IDs exploited by the PETYA malware.

RQ1: How dangerous is PETYA malware in comparison to other ransomware variants?

This research can be based on a qualitative research methodology to evaluate the working of PETYA malware from various aspects, the methodology it follows and its implications. The research can begin by evaluating how PETYA malware works, how it is triggered, what encryption is applied and other factors. A sample of source code can also be analyzed to learn more about how cryptography is used in ransomware (Abijah Roseline & Geetha, 2021).

  • Abijah Roseline, S. and Geetha, S. (2021) “A comprehensive survey of tools and techniques mitigating computer and mobile malware attacks,” Computers & Electrical Engineering , 92, p. 107143.
  • McIntosh, T. et al. (2021) “Enforcing situation-aware access control to build malware-resilient file systems,” Future Generation Computer Systems , 115, pp. 568-582.
  • Singh, J. and Singh, J. (2021) “A survey on machine learning-based malware detection in executable files,” Journal of Systems Architecture , 112, p. 101861.

Setting up a live streaming server on a cloud platform

Nowadays, various organizations require a live streaming server to stream content related to their business. However, due to a lack of proper hardware, organizations are likely to face high network congestion, slowness and other problems (Ji et al., 2023). Recent cases show that a streaming server set up in a local environment is not expected to perform as well as a cloud-based streaming server (Martins et al., 2019). This particular research topic can be based on setting up a live streaming server on the AWS or Azure cloud platform to make sure high network bandwidth is provided with decreased latency. A gap analysis can be conducted comparing the performance of live streaming servers in local and cloud environments in terms of network performance metrics (Bilal et al., 2018).

  • To set up a live streaming server on the AWS or Azure cloud platform to provide live streaming services.
  • To use load balancers alongside streaming servers to ensure the load is balanced and scalability is achieved.
  • To use Wireshark software to test network performance during live streaming.

RQ1: Why are in-house streaming servers not able to provide improved performance in comparison to cloud-based servers?

RQ2: What additional services are provided by cloud service providers which help in maintaining network performance?

The implementation is expected to be carried out on the AWS cloud platform with other AWS services, i.e. a load balancer, a private subnet and more (Efthymiopoulou et al., 2017). Hence, this research can be carried out using a quantitative research method. EC2 instances can be configured to act as streaming servers for media and games. For testing, OBS Studio can be used to check whether streaming is enabled. For network performance, Wireshark can be used (George et al., 2020).
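Provisioning the streaming instance can also be scripted; a hedged sketch using the boto3 AWS SDK, where the AMI ID, instance type, key pair and security group are placeholders and AWS credentials are assumed to be configured in the environment:

```python
# Launch an EC2 instance intended to act as the streaming server.
# AMI ID, key pair, security group and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",   # e.g. an image with the media server pre-installed
    InstanceType="t3.medium",
    KeyName="streaming-key",
    SecurityGroupIds=["sg-xxxxxxxx"],  # should allow RTMP/HTTP(S) inbound
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("launched streaming server:", instance_id)
```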

  • Bilal, K., Erbad, A. and Hefeeda, M. (2018) “QoE-aware distributed cloud-based live streaming of multi-sourced Multiview Videos,” Journal of Network and Computer Applications , 120, pp. 130-144.
  • Efthymiopoulou, M. et al. (2017) “Robust control in cloud-assisted peer-to-peer live streaming systems,” Pervasive and Mobile Computing , 42, pp. 426-443.
  • George, L.C. et al. (2020) “Usage visualization for the AWS services,” Procedia Computer Science , 176, pp. 3710–3717.
  • Ji, X. et al. (2023) “Adaptive QoS-aware multipath congestion control for live streaming,” Computer Networks , 220, p. 109470.
  • Martins, R. et al. (2019) “Iris: Secure reliable live-streaming with Opportunistic Mobile Edge Cloud offloading,” Future Generation Computer Systems , 101, pp. 272-292.

Significance of using the OSINT framework for network reconnaissance

Network reconnaissance is becoming more important day by day in penetration testing. Almost all white-hat hackers depend on the OSINT framework to start network reconnaissance and footprinting when evaluating organizational infrastructure. On the other hand, cyber attackers also use this technique to start fetching information about their targets. Currently, little investigation has been carried out to identify how effective the OSINT framework is compared with traditional reconnaissance activities (Liu et al., 2022). This research focuses on using OSINT techniques to analyze targets using different sets of tools such as Maltego, email analysis and many other techniques. The analysis can be based on fetching sensitive information about the target that could be used for conducting illegal activities (Abdullah, 2019).

  • To use Maltego software to conduct network reconnaissance on the target by fetching sensitive information.
  • To compare the OSINT framework with other techniques to analyze why it performs well.

RQ1: What is the significance of using the OSINT framework in conducting network reconnaissance?

RQ2: How can the OSINT framework be used by cyber hackers for conducting illegitimate activities?

The OSINT framework is easily accessible on its official website, where different search options are given. Hence, this research can be carried out using a quantitative research method. Depending on the selected target, each option can be explored and tools can be shortlisted for the final implementation. Once the tools are shortlisted, they can be used to conduct network reconnaissance (González-Granadillo et al., 2021). For example, Maltego can be used, as it is powerful software for fetching information about a target.
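Even the simplest scripted reconnaissance illustrates what such tools automate; a minimal standard-library sketch that resolves a host and inspects its TLS certificate (the domain below is the reserved example.com placeholder, and real engagements must stay within an authorized scope):

```python
# Minimal passive reconnaissance: resolve a host and inspect its TLS certificate.
# Domain is a placeholder; only use against targets you are authorized to assess.
import socket
import ssl

target = "example.com"

# DNS resolution: which IP addresses answer for the target?
addresses = sorted({info[4][0] for info in socket.getaddrinfo(target, 443)})
print("resolved addresses:", addresses)

# TLS certificate: issuer and subject alternative names often reveal related hosts.
ctx = ssl.create_default_context()
with ctx.wrap_socket(socket.create_connection((target, 443), timeout=5),
                     server_hostname=target) as tls:
    cert = tls.getpeercert()
print("issuer:", dict(x[0] for x in cert["issuer"]))
print("subjectAltName:", cert.get("subjectAltName", ()))
```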

  • Abdullah, S.A. (2019) “Seui-64, bits an IPv6 addressing strategy to mitigate reconnaissance attacks,” Engineering Science and Technology , an International Journal, 22(2), pp. 667–672.
  • Gonzalez-Granadillo, G. et al. (2021) “ETIP: An enriched threat intelligence platform for improving OSINT correlation, analysis, visualization and sharing capabilities,” Journal of Information Security and Applications , 58, p. 102715.
  • Liu, W. et al. (2022) “A hybrid optimization framework for UAV Reconnaissance Mission Planning,” Computers & Industrial Engineering , 173, p. 108653.

Wired and wireless network hardening in Cisco Packet Tracer

At present, network security has become essential, and if enterprises do not pay attention to their security infrastructure, there are many opportunities for cyber breaches. To overcome these issues, secure wired and wireless networks need to be set up using techniques such as filtered ports, firewalls, VLANs and other security mechanisms. For the practical part, Packet Tracer software can be used to design and implement a highly secure network (Sun, 2022).

  • To use Packet Tracer simulation software to set up secure wired and wireless networks.
  • To use different hardening techniques, including access control rules, port filtering, password enforcement and more, to ensure that only authorized users can access the network (Zhang et al., 2012).

RQ: Why is there a need for emphasizing wired and wireless network security?

Following a quantitative approach, the proposed implementation can be performed in Cisco Packet Tracer simulation software. Devices such as routers, switches, firewalls, wireless access points, hosts and workstations can be configured and interconnected using Cat 6e cabling. For security, every device can be checked and secure design principles followed, such as access control rules, disabled unused ports, passwords and encryption (Smith & Hasan, 2020).

  • Smith, J.D. and Hasan, M. (2020) “Quantitative approaches for the evaluation of Implementation Research Studies,” Psychiatry Research , 283, p. 112521.
  • Sun, J. (2022) “Computer Network Security Technology and prevention strategy analysis,” Procedia Computer Science , 208, pp. 570–576.
  • Zhang, Y., Liang, R. and Ma, H. (2012) “Teaching innovation in computer network course for undergraduate students with a packet tracer,” IERI Procedia , 2, pp. 504–510.

Different preemptive ways to resist spear-phishing attacks

When it comes to social engineering, phishing attacks are rising and have become one of the most common security issues, as phishing is one of the easiest ways to trick victims into giving up information. This research topic is based on following different proactive techniques that help in resisting spear-phishing attacks (Xu et al., 2023). This can be achieved by using the Go-Phish filter on the machine, which can automatically detect and alert users as soon as a phishing URL is detected. It can be performed on a cloud platform where an Apache2 server can be configured along with an anti-phishing filter to protect against phishing attacks (Yoo & Cho, 2022).

  • To set up a virtual instance on a cloud platform with an Apache2 server and anti-phishing software to detect possible phishing attacks.
  • To research spear phishing and other types of phishing attacks that victims can face (Al-Hamar et al., 2021).

RQ1: Are phishing attacks growing just like other cyber-attacks?

RQ2: How effective are anti-phishing filters in comparison to cyber awareness sessions?

The entire research can be conducted adhering to a quantitative research methodology, which helps in addressing all research objectives and questions. The anti-phishing filter can be implemented by creating a virtual instance on the cloud platform and configuring it with the filter. Along with this, some phishing attempts can be performed to check whether the filter works (Siddiqui et al., 2022).
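Anti-phishing filters typically combine blocklists with URL heuristics; purely as a hedged illustration of the heuristic side (the keywords and thresholds below are arbitrary placeholders, not what any production filter uses):

```python
# Toy URL heuristics of the kind an anti-phishing filter might combine
# with blocklists. Keywords and thresholds are illustrative only.
from urllib.parse import urlparse

SUSPICIOUS_WORDS = ("login", "verify", "update", "secure", "account")

def phishing_score(url):
    parsed = urlparse(url)
    host = parsed.hostname or ""
    score = 0
    if host.replace(".", "").isdigit():   # raw IP address instead of a domain name
        score += 2
    if host.count(".") >= 4:              # deeply nested subdomains
        score += 1
    if "@" in url or "-" in host:         # common obfuscation tricks
        score += 1
    score += sum(word in url.lower() for word in SUSPICIOUS_WORDS)
    return score

for u in ("https://mail.example.com/inbox",
          "http://192.168.4.22/secure-login/verify-account"):
    s = phishing_score(u)
    print(f"{u} -> score {s} -> {'ALERT' if s >= 3 else 'ok'}")
```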

  • Al-Hamar, Y. et al. (2021) “Enterprise credential spear-phishing attack detection,” Computers & Electrical Engineering , 94, p. 107363.
  • Siddiqui, N. et al. (2022) “A comparative analysis of US and Indian laws against phishing attacks,” Materials Today: Proceedings , 49, pp. 3646–3649.
  • Xu, T., Singh, K. and Rajivan, P. (2023) “Personalized persuasion: Quantifying susceptibility to information exploitation in spear-phishing attacks,” Applied Ergonomics , 108, p. 103908.
  • Yoo, J. and Cho, Y. (2022) “ICSA: Intelligent chatbot security assistant using text-CNN and multi-phase real-time defense against SNS phishing attacks,” Expert Systems with Applications , 207, p. 117893.

Evaluating the effectiveness of distributed denial of service attacks

The given research topic is based on evaluating the effectiveness of distributed denial-of-service (DDoS) attacks on cloud and local environments; hence, this research can be carried out using a quantitative research method. Cyber attackers regard DDoS as one of the most dangerous forms of technological exploitation when it comes to impacting network availability (Krishna Kishore et al., 2023). This research can revolve around scrutinizing the impact of DDoS attacks on a local environment and a cloud environment. This can be done by executing DDoS attacks on a simulated environment using hping or similar software to check where the impact has the higher magnitude (de Neira et al., 2023).

  • To set up servers in local and cloud environments and target them with DDoS attacks to check which experiences more slowness.
  • To determine the types of DDoS attacks, their magnitude and possible mitigation techniques.

RQ: Why do DDoS attacks have a dynamic nature, and how severely are they likely to impact victims?

The experimentation for this research can be executed by creating servers in local and cloud environments; hence, this research can be carried out using a quantitative research method. These servers can be set up as web servers using the Apache2 service. On the other hand, a Kali Linux machine can be configured with DDoS execution software. Each server can be targeted with DDoS attacks to check their effectiveness (Benlloch-Caballero et al., 2023).
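The impact on each server can be quantified by sampling its response time throughout the experiment; a minimal standard-library sketch that logs a time series so the before/during/after phases can be compared later (the target URL, duration and output path are placeholders):

```python
# Sample the target's response time once per second and append to a CSV so the
# before/during/after phases of the DDoS experiment can be compared later.
import csv, time, urllib.request, urllib.error

TARGET = "http://10.0.0.20/"   # the web server under test (placeholder)
DURATION_S = 300               # length of the measurement window

with open("availability.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "latency_ms", "status"])
    end = time.time() + DURATION_S
    while time.time() < end:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(TARGET, timeout=3) as resp:
                latency = round(1000 * (time.monotonic() - start), 1)
                writer.writerow([time.time(), latency, resp.status])
        except (urllib.error.URLError, OSError):
            writer.writerow([time.time(), "", "unreachable"])
        time.sleep(1)
```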

  • Benlloch-Caballero, P., Wang, Q. and Alcaraz Calero, J.M. (2023) “Distributed dual-layer autonomous closed loops for self-protection of 5G/6G IOT networks from distributed denial of service attacks,” Computer Networks , 222, p. 109526.
  • de Neira, A.B., Kantarci, B. and Nogueira, M. (2023) “Distributed denial of service attack prediction: Challenges, open issues and opportunities,” Computer Networks , 222, p. 109553.
  • Krishna Kishore, P., Ramamoorthy, S. and Rajavarman, V.N. (2023) “ARTP: Anomaly-based real time prevention of distributed denial of service attacks on the web using machine learning approach,” International Journal of Intelligent Networks , 4, pp. 38–45.


15 Latest Networking Research Topics for Students

Research in every field is becoming more and more essential because of constant developments around the world, and the field of networking is no exception. For this reason, students preparing to master the field of networking need to keep their knowledge of the current state of the art up to date.

However, choosing the right research topic is often a tough task for students who want to carry out their research effectively. That being the case, this list contains 15 of the latest research topics in the field of networking. Whether you are a seasoned researcher or just starting out, it can provide ample inspiration and guidance to drive your research forward in this dynamic and evolving field.



UMN Networking Research Group

The Computer Networking Research Lab in the Department of Computer Science and Engineering at the University of Minnesota is led by Professor Zhi-Li Zhang. We conduct research on a broad range of topics related to computer and communication networks, helping transform the current best-effort Internet into a more reliable, available, and secure information infrastructure for all kinds of communication activities. Our research spans areas such as Software-defined Networking (SDN), Network Function Virtualization (NFV), Content Distribution Networks (CDNs), Routing, Network Resource Management, Network Architecture Design, ISP Traffic Analysis, Wireless Network Modeling, Network Robustness, Network Economics and more. In carrying out our research, we blend formal modeling/analysis, experimentation/implementation, and testing/evaluation. Please follow the link Projects to browse the ongoing projects in the lab.

  • Smart Cloud Commuting Service
  • 5G Measurements
  • Graph Neural Networks
  • Resilient Routing
  • Smart Caches
  • Cellular Data Networks
  • Publications


Updated: 1 July 2024

Networking, or computer networking, is the process of connecting two or more computing devices, such as desktop computers, mobile devices, routers or applications, to enable the transmission and exchange of information and resources.

Networked devices rely on communications protocols—rules that describe how to transmit or exchange data across a network—to share information over physical or wireless connections.

Before contemporary networking practices, engineers would have to physically move computers to share data between devices, which was an unpleasant task at a time when computers were large and unwieldy. To simplify the process (especially for government workers), the Department of Defense funded the creation of the first functioning computer network (eventually named ARPANET) in the late 1960s.

Since then, networking practices—and the computer systems that drive them—have evolved tremendously. Today’s computer networks facilitate large-scale inter-device communication for every business, entertainment and research purpose. The internet, online search, email, audio and video sharing, online commerce, live-streaming and social media all exist because of advancements in computer networking.


Before we delve into more complex networking topics, it’s important to understand fundamental networking components, including:

  • IP address: An IP address is the unique number assigned to every network device in an Internet Protocol (IP) network; each IP address identifies the device’s host network and its location on the network. When one device sends data to another, the data includes a “header” that includes the IP addresses of both the sending and receiving devices.
  • Nodes: A node is a network connection point that can receive, send, create or store data. It’s essentially any network device—computers, printers, modems, bridges or switches—that can recognize, process and transmit information to another network node. Each node requires some form of identification (such as an IP or MAC address) to receive access to the network.
  • Routers: A router is a physical or virtual device that sends data “packets” between networks. Routers analyze the data within packets to determine the best transmission path and use sophisticated routing algorithms to forward data packets until they reach their destination node.

  • Switches: A switch is a device that connects network devices and manages node-to-node communication across a network, making sure that data packets reach their intended destination. Unlike routers, which send information between networks, switches send information between nodes within a network.

Consequently, “switching” refers to how data is transferred between devices on a network. Networks rely on three main types of switching:

Circuit switching establishes a dedicated data communication path between nodes in a network, so no other traffic can traverse the same path. Circuit switching sees to it that full bandwidth is available during every transmission.

Message switching sends whole messages from the source node to the destination node, with the message traveling from switch to switch until it reaches the destination.

Packet switching involves breaking down data into independent components to make data transmission less demanding of network resources. With packet switching, packets—instead of entire data streams—travel through the network to their end destination.
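A toy sketch of the packet-switching idea: the sender chops a message into numbered packets, the packets may arrive out of order, and the receiver reassembles them by sequence number (real networks do this at the IP/TCP layers; the packet size and shuffling here are purely illustrative):

```python
# Toy illustration of packet switching: split a message into numbered packets,
# deliver them out of order, and reassemble by sequence number.
import random

def packetize(message: bytes, size: int = 8):
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packet switching sends independent chunks across the network."
packets = packetize(message)
random.shuffle(packets)   # packets may take different paths and arrive out of order
assert reassemble(packets) == message
print(f"{len(packets)} packets reassembled into: {reassemble(packets).decode()}")
```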

  • Ports: A port indicates a specific connection between network devices, with each port identified by a number. If an IP address is analogous to a hotel address, then ports are the suites and room numbers. Computers use port numbers to determine which application, service or process should receive which messages.
  • Gateways: Gateways are hardware devices that facilitate communication between two different networks. Routers, firewalls and other gateway devices use rate converters, protocol translators and other technologies to make inter-network communication possible between otherwise incompatible devices.

Typically, computer networks are defined by geographical area. A local area network (LAN) connects computers in a defined physical space, while a wide area network (WAN) can connect computers across continents. However, networks are also defined by the protocols they use to communicate, the physical arrangement of their components, how they manage network traffic and the purpose they serve in their respective environments.

Here, we’ll discuss the most common and widely used computer network types in three broad categories.

The network types in this category are distinguished by the geographical area the network covers.

A LAN connects computers over a relatively short distance, such as those within an office building, school or hospital. LANs are typically privately owned and managed.

As the name implies, a WAN connects computers across large geographical areas, such as regions and continents. WANs often have collective or distributed ownership models for network management purposes.  Cloud networks serve as one example, since they’re hosted and delivered by public and private cloud infrastructures across the globe. A software-defined wide area network (SD-WAN) is a virtualized WAN architecture that uses SDN principles to centralize the management of disconnected WAN networks and optimize network performance.

Metropolitan area networks (MANs) are larger than LANs but smaller than WANs. Cities and government entities typically own and manage MANs.

A personal area network (PAN) serves one person. If a user has multiple devices from the same manufacturer (an iPhone and a MacBook, for instance), it’s likely they've set up a PAN that shares and syncs content—text messages, emails, photos and more—across devices.

Network nodes can send and receive messages using either wired or wireless links (connections).

Wired network devices are connected by physical wires and cables, including copper wires and Ethernet, twisted pair, coaxial or fiber optic cables. Network size and speed requirements typically dictate the choice of cable, the arrangement of network elements and the physical distance between devices.

Wireless networks forgo cables for infrared, radio or electromagnetic wave transmission across wireless devices with built-in antennae and sensors.

Computing networks can transmit data using a range of transmission dynamics, including: 

In a multipoint network, multiple devices share channel capacity and network links.

In a point-to-point network, devices establish a direct node-to-node link to transmit data.

On broadcast networks, several interested “parties” (devices) can receive one-way transmissions from a single sending device. Television stations are a great example of broadcast networks.

A VPN is a secure, point-to-point connection between two network endpoints. It establishes an encrypted channel that keeps a user’s identity and access credentials, as well as any data transferred, inaccessible to hackers.

Computer network architecture establishes the theoretical framework of a computer network, including design principles and communications protocols.

Primary types of network architectures

  • Peer-to-peer (P2P) architectures: In a P2P architecture, two or more computers are connected as “peers,” meaning they have equal power and privileges on the network. A P2P network doesn’t require a central server for coordination. Instead, each computer on the network acts as both a client (a computer that needs to access a service) and a server (a computer that provides services to clients). Every peer on the network makes some of its resources available to other network devices, sharing storage, memory, bandwidth and processing power across the network.
  • Client-server architectures: In a client-server network, a central server (or group of servers) manages resources and delivers services to client devices on the network; clients in this architecture don’t share their resources and only interact through the server. Client-server architectures are often called tiered architectures because of their multiple layers.
  • Hybrid architectures: Hybrid architectures incorporate elements of both the P2P and client-server models.

Whereas architecture represents the theoretical framework of a network, topology is the practical implementation of the architectural framework. Network topology describes the physical and logical arrangement of nodes and links on a network, including all hardware (routers, switches, cables), software (apps and operating systems) and transmission media (wired or wireless connections).

Common network topologies include bus, ring, star and mesh.

In a bus network topology, every network node is directly connected to a main cable. In a ring topology, nodes are connected in a loop, so each device has exactly two neighbors. Adjacent pairs are connected directly and nonadjacent pairs are connected indirectly through intermediary nodes. Star network topologies feature a single, central hub through which all nodes are indirectly connected.

Mesh topologies are a bit more complex, defined by overlapping connections between nodes. There are two types of mesh networks: full mesh and partial mesh. In a full mesh topology, every network node connects to every other network node, providing the highest level of network resilience. In a partial mesh topology, only some network nodes connect, typically those that exchange data most frequently.

Full mesh topologies can be expensive and time-consuming to run, which is why they’re often reserved for networks that require high redundancy. Partial mesh, on the other hand, provides less redundancy but is more cost-effective and simpler to run.

Regardless of subtype, mesh networks have self-configuration and self-organization capabilities; they automate the routing process, so the network always finds the fastest, most reliable data path.
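
To make the routing idea concrete, here is a minimal sketch (not tied to any particular mesh product) of how a node could compute the fastest path across a partial mesh using Dijkstra's algorithm. The topology and the per-link latencies are hypothetical edge weights invented for the example.

```python
import heapq

def shortest_path(graph, source, target):
    """Dijkstra's algorithm over a weighted adjacency map.
    graph: {node: {neighbor: link_cost_ms}}"""
    dist = {source: 0}
    prev = {}
    queue = [(0, source)]
    visited = set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == target:
            break
        for neighbor, weight in graph.get(node, {}).items():
            new_cost = cost + weight
            if new_cost < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_cost
                prev[neighbor] = node
                heapq.heappush(queue, (new_cost, neighbor))
    # Rebuild the path by walking back from target to source
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist[target]

# Hypothetical partial mesh with per-link latency in milliseconds
mesh = {
    "A": {"B": 5, "C": 2},
    "B": {"A": 5, "D": 1},
    "C": {"A": 2, "D": 7},
    "D": {"B": 1, "C": 7},
}
print(shortest_path(mesh, "A", "D"))  # (['A', 'B', 'D'], 6)
```

Real mesh networks use distributed routing protocols rather than a single global computation, but the underlying objective of minimizing path cost is the same.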

Whether it’s the internet protocol (IP) suite, Ethernet, wireless LAN (WLAN) or cellular communication standards, all computer networks follow communication protocols—sets of rules that every node on the network must follow in order to share and receive data. Protocols also rely on gateways to enable incompatible devices to communicate (a Windows computer attempting to access Linux servers, for instance).

Many modern networks run on the TCP/IP model, which includes four network layers.

  • Network access layer.  Also called the data link layer or the physical layer, the network access layer of a TCP/IP network includes the network infrastructure (hardware and software components) necessary for interfacing with the network medium. It handles physical data transmission—using Ethernet and protocols such as the address resolution protocol (ARP)—between devices on the same network.
  • Internet layer. The internet layer is responsible for logical addressing, routing and packet forwarding. It primarily relies on the IP protocol and the Internet Control Message Protocol (ICMP), which manages addressing and routing of packets across different networks.
  • Transport layer. The TCP/IP transport layer enables data transfer between upper and lower layers of the network. Using TCP and UDP protocols, it also provides mechanisms for error checking and flow control. TCP is a connection-based protocol that is generally slower but more reliable than UDP. UDP is a connectionless protocol that is faster than TCP but does not provide guaranteed transfer; it facilitates packet transmission for time-sensitive apps (such as video streaming and gaming platforms) and DNS lookups. The two are contrasted in the short sketch after this list.
  • Application layer. TCP/IP’s application layer uses HTTP, FTP, Post Office Protocol 3 (POP3), SMTP, domain name system (DNS) and SSH protocols to provide network services directly to applications. It also manages all the protocols that support user applications.
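
As an illustration of the transport-layer distinction described above, the following sketch uses Python's standard socket module to send the same payload over TCP and over UDP. The host, port and echo-style server it talks to are assumptions made for the example, not part of any real deployment.

```python
import socket

# TCP: connection-oriented; the handshake happens inside create_connection()
def tcp_request(host: str, port: int, payload: bytes) -> bytes:
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(payload)            # delivery and ordering are guaranteed by TCP
        return s.recv(4096)

# UDP: connectionless; each datagram stands alone, with no delivery guarantee
def udp_request(host: str, port: int, payload: bytes) -> bytes:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(5)
        s.sendto(payload, (host, port))
        data, _addr = s.recvfrom(4096)
        return data
```

The TCP variant pays the cost of connection setup and retransmission in exchange for reliability; the UDP variant simply fires the datagram and hopes a reply arrives before the timeout, which is why it suits latency-sensitive traffic.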

Though TCP/IP is more directly applicable to networking, the Open Systems Interconnection (OSI) model —sometimes called the OSI reference model—has also had a substantial impact on computer networking and computer science, writ broadly.

OSI is a conceptual model that divides network communication into seven abstract layers (instead of four), providing a theoretical underpinning that helps engineers and developers understand the intricacies of network communication. The OSI model's primary value lies in its educational utility and its role as a conceptual framework for designing new protocols, making sure that they can interoperate with existing systems and technologies.

However, the TCP/IP model's practical focus and real-world applicability have made it the backbone of modern networking. Its robust, scalable design and horizontal layering approach has driven the explosive growth of the internet, accommodating billions of devices and massive amounts of data traffic.

Using email as an example, let’s walk through an example of how data moves through a network.

If a user wants to send an email, they first write the email and then press the “send” button. When the user presses “send,” the SMTP protocol (POP3 or IMAP is used on the receiving side to retrieve mail) directs the message from the sender node, over the sender’s Wi-Fi connection and through the network switches, where it’s compressed and broken down into smaller and smaller segments (and ultimately into bits, or strings of 1s and 0s).

Network gateways direct the bit stream to the recipient’s network, converting data and communication protocols as needed. When the bit stream reaches the recipient’s mail server, the transport and mail protocols direct the email data through the network switches on the receiver’s network. In the process, the network reassembles the original message until the email arrives, in human-readable form, in the recipient’s inbox (the receiver node).
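
For a concrete view of the sending side of this walkthrough, here is a minimal sketch using Python's standard smtplib. The addresses, mail server and credentials are placeholders; a real deployment would load them from configuration and use its own provider's SMTP host.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical sender, recipient and mail server
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.net"
msg["Subject"] = "Hello"
msg.set_content("This message travels hop by hop across the network.")

with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()                                   # encrypt the connection
    server.login("alice@example.com", "app-password")   # placeholder credentials
    server.send_message(msg)                            # SMTP hands the message to the next hop
```

Everything below this call—segmentation, routing across gateways and reassembly on the recipient's network—is handled by the lower layers described earlier.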

Computer networks are inescapable, present in many aspects of modern life. In business, relying on computer networks isn’t an option—they are fundamental to the operation of modern enterprises.

Computer networks provide numerous benefits, including:

Networking enables every form of digital communication, including email, messaging, file sharing, video calls and streaming. Networking connects all the servers, interfaces and transmission media that make business communication possible.

Without networking, organizations would have to store data in individual data repositories, which is unsustainable in the age of  big data.  Computer networks help teams keep centralized data stores that serve the entire network, freeing up valuable storage capacity for other tasks.

Users, network administrators and developers alike stand to benefit from how networking simplifies resource and knowledge sharing. Networked data is easier to request and fetch, so users and clients get faster responses from network devices. And for those on the business side, networked data makes it easier for teams to collaborate and share information as technologies and enterprises evolve.

Not only are well-built networking solutions more resilient, but they also offer businesses more options for  cybersecurity  and  network security . Most network providers offer built-in encryption protocols and access controls (such as  multifactor authentication ) to protect sensitive data and keep bad actors off the network.

Modern network infrastructures built for digital transformation require solutions that can be just as dynamic, flexible and scalable as the new environments. IBM® SevOne® provides application-centric network observability to help NetOps spot, address and prevent network performance issues in hybrid environments. 

IBM NS1 Connect® provides fast, secure connections to users anywhere in the world with premium DNS and advanced, customizable traffic steering. Always-on, API-first architecture enables your IT teams to more efficiently monitor networks, deploy changes and conduct routine maintenance.

IBM Cloud Pak® for Network Automation is an intelligent cloud platform that enables the automation and orchestration of network operations so CSPs and MSPs can transform their networks, evolve to zero-touch operations, reduce OPEX and deliver services faster.

IBM Hybrid Cloud Mesh, a multicloud networking solution, is a SaaS product designed to enable organizations to establish simple and secure application-centric connectivity across a wide variety of public and private cloud, edge and on-premises environments.

Cloud networking solutions can help your organization implement a secure, highly available global network. Working with an experienced network service provider, you can design and build the unique configuration that enables you to optimize network traffic flow, protect and support applications and meet your specific business needs.

A content delivery network (CDN) is a network of servers that is geographically dispersed to enable faster web performance by locating copies of web content closer to users or facilitating delivery of dynamic content.

Network monitoring means using network monitoring software to monitor a computer network’s ongoing health and reliability.

NetFlow, a network protocol developed for Cisco routers by Cisco Systems, is widely used to collect metadata about the IP traffic flowing across network devices such as routers, switches and hosts.

Software-defined networking (SDN) is a software-controlled approach to networking architecture driven by application programming interfaces (APIs).

Middleware is software that enables one or more kinds of communication or connectivity between applications or components in a distributed network.

Computer Network: Recently Published Documents

Big Data Security Management Countermeasures in the Prevention and Control of Computer Network Crime

This paper studies countermeasures for big data security management in the prevention and control of computer network crime, in the absence of relevant legislation and judicial practice. Starting from the concepts and definitions of computer crime and network crime, it puts forward a comparison matrix, an investigation and statistics method, and a characteristic measure of computer crime. Through crime scene investigation, network investigation and network tracking, the paper studies big data security management countermeasures in the prevention and control of computer network crime from the perspective of criminology. The experimental results show that offenders are increasingly young and that the number of teenagers participating in network crime is on the rise; across all case types, criminals under the age of 35 account for more than 50%.

Mathematical Models of Effective Topology of Computer Networks for Electric Power Supply Control on Railway Transport

The paper analyses modern directions in the innovation-and-investment formation of intelligent computer networks that control the fast-moving technological processes of electricity supply. It is based on the conclusion that the dominant problem is increasing the productivity of information exchange between information resources and consumers. A method for increasing the efficiency of information exchange is proposed: finding the rational location of a new node and organising its set of connections among the existing nodes of the computer network so as to provide the minimum average topological distance. Mathematical models of the effective topological organisation of connections in the computer network controlling power consumption are proposed at the level of traction substations, electric power distances and the railway as a whole.
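
The abstract does not give the models themselves, but the optimisation it describes can be illustrated with a toy brute-force sketch: attach a new node to a small number of existing nodes so that the network's average topological (hop-count) distance is minimised. The network, node names and link budget below are invented for the example and are not the authors' models.

```python
from itertools import combinations
from collections import deque

def hop_distances(adj, source):
    """BFS hop counts from source to every reachable node."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nb in adj[node]:
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)
    return dist

def average_distance(adj):
    """Mean hop count over all ordered node pairs (assumes a connected graph)."""
    nodes = list(adj)
    total = pairs = 0
    for n in nodes:
        d = hop_distances(adj, n)
        for m in nodes:
            if m != n:
                total += d[m]
                pairs += 1
    return total / pairs

def best_attachment(adj, new_node, k):
    """Try every set of k existing nodes to connect the new node to,
    and return the set that minimises the average topological distance."""
    best = None
    for combo in combinations(list(adj), k):
        trial = {n: set(nbrs) for n, nbrs in adj.items()}
        trial[new_node] = set(combo)
        for n in combo:
            trial[n].add(new_node)
        score = average_distance(trial)
        if best is None or score < best[0]:
            best = (score, combo)
    return best

# Hypothetical existing control network (adjacency sets)
network = {"s1": {"s2"}, "s2": {"s1", "s3"}, "s3": {"s2", "s4"}, "s4": {"s3"}}
print(best_attachment(network, "new", k=2))
```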

AI-assisted Computer Network Operations testbed for Nature-Inspired Cyber Security based adaptive defense simulation and analysis

D-RAM Distribution: A Popular Energy-Saving Memory Mining Blockchain Technology

Cryptocurrencies have surged massively over the last decade, marking an epochal milestone for blockchain technology. Blockchain is a distributed database that is shared among the nodes of a computer network. Bitcoin, which plays a critical role in the cryptocurrency system, is a typical blockchain, maintaining a secure and decentralized record of transactions. The value and advantages of blockchain technology have been acknowledged by the public and applied in different areas, but its defects are gradually being revealed: excessive electricity consumption has severely affected utility usage in many countries. DRD Memory Mining is therefore developed to address these defects. This article illustrates the algorithm of DRD Memory Mining and how its memory consensus mechanism works.

Computer Network Attack Detection Using Enhanced Clustering Technologies

The need for security mechanisms stems from the importance of data privacy, especially after the communication revolution of recent times. Advances in data mining and machine learning have paved the road for establishing efficient attack-prediction paradigms to protect large-scale networks. In this project, computer network intrusions are detected using smart machine learning algorithms. Using the large KDD computer intrusion dataset, which includes a large number of connections labelled with several types of attacks, a model is established for predicting the type of attack by learning from this data. A feed-forward neural network model outperformed the other proposed clustering models in attack-prediction accuracy.
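
The abstract does not include code, but the general shape of the approach it describes (a feed-forward neural network trained on labelled connection records) might look like the sketch below. Synthetic data stands in for the KDD dataset, and the layer sizes and hyperparameters are assumptions for illustration, not the paper's actual model.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for KDD-style connection records (41 features per connection)
X, y = make_classification(n_samples=5000, n_features=41, n_informative=10,
                           n_classes=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)

# Feed-forward network: two hidden layers, trained on normal vs. attack labels
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

pred = clf.predict(scaler.transform(X_test))
print("attack-detection accuracy:", accuracy_score(y_test, pred))
```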

Research and Implementation of Recommendation Algorithm in Current Computer Network

Computer Network Security Protection Strategy Based on Big Data

Exploration of Constructing a New Mode of Evaluation of TCM Surgery Based on Computer Network Platform

Computer Network Security Defense Model

Abstract: With the rapid development of the Internet industry, hundreds of millions of online resources are also booming. In the information space with huge and complex resources, it is necessary to quickly help users find the resources they are interested in and save users time. At this stage, the content industry’s application of the recommendation model in the content distribution process has become the mainstream. The content recommendation model provides users with a highly efficient and highly satisfying reading experience, and solves the problem of information redundancy to a certain extent. Knowledge tag personalized dynamic recommendation technology is currently widely used in the field of e-commerce. The purpose of this article is to study the optimization of the knowledge tag personalized dynamic recommendation system based on artificial intelligence algorithms. This article first proposes a hybrid recommendation algorithm based on the comparison between content-based filtering and collaborative filtering algorithms. It mainly introduces user browsing behavior analysis and design, KNN-based item similarity algorithm design, and hybrid recommendation algorithm implementation. Finally, through algorithm simulation experiments, the effectiveness of the algorithm in this paper is verified, and the accuracy of the recommendation has been improved.
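
As a rough illustration of the KNN-based item-similarity step the abstract mentions (not the paper's exact hybrid algorithm), the following sketch scores an item a user has not yet rated from the ratings of that user's most similar items. The rating matrix is a made-up toy example.

```python
import numpy as np

# Hypothetical user-item rating matrix (rows: users, columns: items; 0 = unrated)
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 0, 5, 4],
], dtype=float)

def cosine_sim(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def predict(user, item, k=2):
    """Item-based KNN: score an unrated item from the user's k most similar rated items."""
    rated = [j for j in range(ratings.shape[1]) if ratings[user, j] > 0 and j != item]
    sims = sorted(((cosine_sim(ratings[:, item], ratings[:, j]), j) for j in rated),
                  reverse=True)[:k]
    num = sum(s * ratings[user, j] for s, j in sims)
    den = sum(abs(s) for s, _ in sims)
    return num / den if den else 0.0

print(predict(user=1, item=1))  # predicted rating for an item user 1 hasn't rated
```

A hybrid system of the kind the abstract describes would blend such collaborative scores with content-based features (for example, knowledge tags attached to each item).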

Computer Networking Dissertation Topics

Published by Carmen Troy at January 5th, 2023 , Revised On May 16, 2024

A dissertation is an essential aspect of completing your degree program. Whether you are pursuing a master’s or are enrolled in a PhD program, you will not be awarded a degree without successfully submitting a thesis. To ensure that your thesis is submitted without any hindrances, you should first get your topic and dissertation outline approved by your professor. When approving a topic, supervisors consider many aspects.

However, relevance, recency, and conciseness play a huge role in accepting or rejecting your topic.

As a computer networking student, you have a variety of networking topics to choose from. With the field evolving with each passing day, you must ensure that your thesis covers recent computer networking topics and explores a relevant problem or issue. To help you choose the right topic for your dissertation, here is a list of recent and relevant computer networking dissertation topics.

List Of Trending Ideas For Your Computer Networking Dissertation

  • Machine learning for proactive network anomaly detection 
  • The role of software-defined networking (SDN) in network performance and security 
  • Applications and challenges of 6G technologies 
  • How to ensure fairness and efficiency in Multi-Access Edge Computing (MEC)
  • Denial-of-Service (DoS) Attacks in the Age of Distributed Denial-of-Service (DDoS) Attacks
  • Applications and rise of Low-Power Wide Area Networks (LPWANs)
  • Efficient Resource Allocation and Quality-of-Service (QoS) Management
  • Ethical Implications of Artificial Intelligence (AI) in Network Management
  • The best ways to use Blockchain for Tamper-Proof Evidence Collection and Storage
  • Role of Network Operators in Cloud Gaming

Computer Networking Dissertation Topics For Your Research

Topic 1: An evaluation of network security during machine-to-machine (M2M) communication in IoT.

Research Aim: The research aims to evaluate the network security issues associated with M2M communication in IoT.

Objectives:

  • To evaluate the factors affecting the network security of IoT devices.
  • To determine the methods for increasing data integrity in M2M communication against physical tampering and unauthorised monitoring.
  • To evaluate the network security issues associated with M2M communication in IoT and offer suitable recommendations for improvement.

Topic 2: An analysis of the cybersecurity challenges in public clouds and appropriate intrusion detection mechanisms.

Research Aim: The aim of the research is to analyse the cybersecurity challenges in public clouds and the appropriate intrusion detection mechanisms.

Objectives:

  • To analyse the types of cybersecurity threats impacting public clouds.
  • To determine some of the competent intrusion detection techniques that can be used in cloud computing.
  • To investigate the cybersecurity challenges in public clouds and offer mitigating with appropriate intrusion detection techniques.

Topic 3: Investigating the impact of SaaS cloud ERP on the scalability and cost-effectiveness of business.

Research Aim: The research aims to investigate the impact of SaaS cloud ERP on the scalability and cost-effectiveness of business.

  • To analyse the benefits of SaaS ERP over traditional ERP.
  • To evaluate the characteristics of SaaS architecture in cloud computing and determine its varieties.
  • To investigate how SaaS cloud ERP impacts business scalability and cost-effectiveness.

Topic 4: An evaluation of the requirements of cloud repatriation and the challenges associated with it.

Research Aim: The research aims to evaluate the requirements of cloud repatriation in organisations and the associated challenges

  • To analyse the key factors of cloud repatriation.
  • To determine the challenges associated with cloud repatriation from public clouds.
  • To evaluate the need for cloud repatriation in organisations and the associated complexities

Topic 5: An examination of the security mechanisms in decentralised networks and the ways of enhancing system robustness

Research Aim: The research aims to investigate the security mechanisms in decentralised networks and the ways of enhancing system robustness.

  • To analyse the concept of decentralised networks and understand their difference from centralised networks.
  • To analyse the security mechanisms in decentralised networks to determine how it offers visibility and traceability.
  • To investigate the security mechanisms in decentralised networks and how system robustness can be increased for better privacy and security.

Latest Computer Networking Dissertation Topics

Exploring the Importance of Computer Networking in Today’s Era

Research Aim: Even though computer networking has been practised for decades, its importance has increased immensely over the past two years. A few main reasons include the use of technology by almost every business and the aim to offer customers an easy and convenient shopping experience. The main aim of this research will be to explain the concepts of computer networking, its benefits, and its importance in the current era. The research will also discuss how computer networking has helped businesses and individuals perform their work and benefit from it. The research will then specifically state examples where computer networking has brought positive changes and helped people achieve what they want.

Wireless Networks in Business Settings – An Analysis

Research Aim: Wireless networks are crucial in computer networking. They help build networks seamlessly, and once the networks are set up on a wireless network, it becomes extremely easy for the business to perform its daily activities. This research will investigate all about wireless networks in a business setting. It will first introduce the various wireless networks that can be utilised by a business and will then talk about how these networks help companies build their workflow around them. The study will analyse different wireless networks used by businesses and will conclude how beneficial they are and how they are helping the business.

Understanding Virtual Private Networks – A Deep Analysis of Their Challenges

Research Aim: Virtual private networks (VPNs) are extremely common today and are used by businesses and individuals alike. This research aims to understand how these networks operate, how they help businesses build strong and reliable systems, and what challenges they bring. Many businesses do not adopt VPNs because of these challenges, so this research will address them in a way that helps businesses implement VPNs successfully.

A Survey of the Application of Wireless Sensor Networks

Research Aim: Wireless sensor networks are self-configuring, infrastructure-less wireless networks used to pass data. These networks are now extremely popular amongst businesses because they can solve problems in various application domains and possess the capacity to change the way work is done. This research will investigate where wireless sensor networks are implemented, how they are being used, and how they are performing. The research will also investigate how businesses implement these systems and the factors they consider when utilising these wireless sensor networks.

Computer Network Security Attacks – Systems and Methods to Respond

Research Aim: With the advent of technology today, computer networks are extremely prone to security attacks. A lot of networks have security systems in place. However, people with nefarious intent find one way to intrude and steal data/information. This research will address major security attacks that have impacted businesses and will aim to address this challenge. Various methods and systems will be highlighted to protect the computer networks. In addition to this, the research will also discuss various methods to respond to attacks and to keep the business network protected.

Preventing a Cyberattack – How Can You Build a Powerful Computer Network?

Research Aim: Cyberattacks are extremely common these days. No matter how powerful your network is, you might be a victim of phishing or hacking. The main aim of this research will be to outline how a powerful computer network can be built. Various methods to build a safe computer network that can keep data and information will be outlined, and the study will also highlight ways to prevent a cyberattack. In addition to this, the research will talk about the steps that should be taken to keep the computer network safe. The research will conclude with the best way and system to build a powerful and safe computer network.

Types of Computer Networks: A Comparison and Analysis

Research Aim: There are different types of computer networks, including LAN, WAN, PAN, MAN, CAN, SAN, etc. This research will discuss all the various types of computer networks to help readers understand how all these networks work. The study will then compare the different types of networks and analyse how each of them is implemented in different settings. The dissertation will also discuss the type of computer networks that businesses should use and how they can use them for their success. The study will then conclude which computer network is the best and how it can benefit when implemented.

Detecting Computer Network Attacks by Signatures and Fast Content Analysis

Research Aim: With technological advancement, today, many computer network attacks can be detected beforehand. While many techniques are utilised for detecting these attacks, the use of signatures and fast content analysis are the most popular ones. This research will explore these techniques in detail and help understand how they can detect a computer network attack and prevent it. The research will present different ways these techniques are utilised to detect an attack and help build powerful and safe computer networks. The research will then conclude how helpful these two techniques are and whether businesses should implement them.
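
To give a flavour of the signature-matching side of this topic, here is a deliberately simplified sketch. The signature patterns are illustrative toy examples; production systems such as Snort use far richer rule languages and fast multi-pattern matchers over reassembled streams.

```python
import re

# Hypothetical payload signatures; real IDS rule sets are far more elaborate
SIGNATURES = {
    "sql_injection": re.compile(rb"(?i)union\s+select|or\s+1=1"),
    "path_traversal": re.compile(rb"\.\./\.\./"),
    "shellshock":     re.compile(rb"\(\)\s*\{\s*:;\s*\};"),
}

def match_signatures(payload: bytes):
    """Return the names of all signatures that match a packet payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(payload)]

print(match_signatures(b"GET /index.php?id=1 OR 1=1 UNION SELECT password FROM users"))
# ['sql_injection']
```

Fast content analysis in practice pairs such signatures with efficient scanning (for example, Aho-Corasick style multi-pattern search) so that every packet can be checked at line rate.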

Overview of Wireless Network Technologies and their Role in Healthcare

Research Aim: Wireless network technologies are utilised by several industries. Their uses and benefits have helped businesses resolve many business problems and assisted them in conducting their daily activities without any hindrance. This networking topic will help explore how wireless network technologies work and will talk about their benefits. This research aims to find out how wireless technologies help businesses carry out their daily routine tasks effortlessly. For this research, the focus will be on the healthcare industry. The study will investigate how wireless network technology has helped the healthcare sector and how it has benefited them to perform their daily tasks without much effort.

Setting up a Business Communication System over a Computer Network

Research Aim: Communication is an essential aspect of every business. Employees need to communicate effectively to keep the business going. In the absence of effective communication, businesses suffer a lot as the departments are not synchronised, and the operations are haphazard. This research will explore the different ways through which network technologies help conduct smooth and effective communication within organisations. This research will conclude how wireless networks have helped businesses build effective communication systems within their organisation and how they have benefited from it. It will then conclude how businesses have improved and solved major business problems with the help of these systems.

Frequently Asked Questions

How to find computer networking dissertation topics?

To find computer networking dissertation topics:

  • Follow industry news and emerging technologies.
  • Investigate unresolved networking challenges.
  • Review recent research papers.
  • Explore IoT, cybersecurity, and cloud computing.
  • Consider real-world applications.
  • Select a topic aligned with your expertise and career aspirations.

UCF Researcher Clearing the Way for Smart Wireless Networks

Communicating unimpeded at distances near and far is a dream Murat Yuksel is hoping to realize.

His ongoing research, titled “INWADE: INtelligent Waveform Adaptation with DEep Learning,” and funded by the U.S. Air Force Research Laboratory, aims to get us closer to that dream by improving the quality of high-frequency wireless networks, using machine learning to fine-tune the networks’ efficacy.

The need to efficiently improve wireless signal quality will grow with the continuing proliferation of wireless networks for use in communications, says Yuksel, who is a UCF Department of Electrical and Computer Engineering professor within the College of Engineering and Computer Science.

“The emerging 5G-and-beyond wireless networks regularly use high frequency signals that are very sensitive to the environment,” he says. “They get blocked easily or attenuate quickly as they travel. Even the nature of the particles in the air affects them significantly. Deep learning enables us to learn the features of the environment. Hence, using these learned features enables us to better tune the wireless signals to attain higher data transfer rates.”

INWADE is an automated means to design multiple communication blocks at the transmitter and the receiver jointly by training them as a combination of deep neural networks, benefitting wireless network users.

The development and study of the INWADE network was catalyzed by the need to keep pace with the spread and usage of wireless networks.

“Demand for wireless data transfers (such as cellular and Wi-Fi) is ever-increasing and this causes more tussle on the sharing of the underlying natural resource, which is the radio spectrum that supports these wireless transfers,” Yuksel says.

The deep learning component of the research is an emerging approach for delivering better wireless signals with minimal delay. The deep learning network selects the optimal waveform modifications and beam direction based on its perception of the radio-frequency environment, managing the drones and nodes that provide wireless signals and apply those modifications.

“Our work shows the feasibility of using deep reinforcement learning in real time to fine tune millimeter-wave signals, which operate in part of the super-6 GHz bands,” Yuksel says. “Further, the project aims to show that deep learning at the link level as well as network level can work together to make the signals ‘deep smart.’”
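
INWADE's actual models are not public, but the flavour of "learning to tune the link" can be illustrated with a toy epsilon-greedy bandit that learns which of several candidate beam directions yields the best measured throughput. Every value below is a made-up stand-in for a real measurement, and real systems would use far richer state and deep networks rather than a lookup table.

```python
import random

BEAMS = list(range(8))           # candidate beam directions (hypothetical codebook)
q = {b: 0.0 for b in BEAMS}      # running estimate of throughput per beam
counts = {b: 0 for b in BEAMS}
EPSILON = 0.1

def measure_throughput(beam):
    """Stand-in for a real link measurement; beam 3 happens to point at the receiver."""
    return random.gauss(mu=10.0 if beam == 3 else 3.0, sigma=1.0)

for step in range(500):
    # Explore occasionally, otherwise exploit the best estimate so far
    beam = random.choice(BEAMS) if random.random() < EPSILON else max(q, key=q.get)
    reward = measure_throughput(beam)
    counts[beam] += 1
    q[beam] += (reward - q[beam]) / counts[beam]   # incremental mean update

print("selected beam:", max(q, key=q.get))
```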

Harnessing existing wireless networking resources and navigating fixed obstacles or crowded airwaves quickly is an omnipresent concern and leads network managers to search for spectra at higher frequencies than the commonly used sub-6 GHz frequency bands, Yuksel says.

These “super-6 GHz” bands are difficult to access and maintain, so deep learning is something Yuksel is hoping to use to address that challenge.

“They operate with highly directional antennas, which makes them brittle to mobility/movement and they cannot reach far as they are sensitive to particles in the air,” Yuksel says. “Hence, we have to handle them very carefully to support high-throughput data transfers. This requires advanced algorithmic methods that can learn the environment and tune the super-6 GHz wireless signals to the operational environment.”

Some initial findings regarding the viability of algorithms that may be implemented in INWADE were published at the International Federation for Information Processing Internet of Things Conference in late 2023.

The project started earlier in 2024 after receiving the first portion of the awarded $250,000 from the Air Force Research Laboratory in late 2023, but there already are promising findings, Yuksel notes.

“We have shown in a lab testbed that our deep learning methods can successfully solve some of the fundamental link level problems, such as angle-of-arrival detection or finding the direction of an incoming signal,” he says. “This capability is very useful for several critical applications, for example determining location and movement direction. Next steps include demonstrating similar capability on drones and showing the feasibility of co-existence of deep learning at the link and network levels.”
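
For context, the classical (non-learned) way to estimate angle of arrival with a two-element antenna array uses the phase difference between elements, via the textbook relation Δφ = 2πd·sin(θ)/λ. The sketch below applies that relation; it is offered only as background on the problem, not as a description of the project's deep-learning method.

```python
import math

def angle_of_arrival(phase_diff_rad, spacing_m, wavelength_m):
    """Two-element estimate: delta_phi = 2*pi*d*sin(theta)/lambda, solved for theta."""
    s = phase_diff_rad * wavelength_m / (2 * math.pi * spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# 60 GHz mmWave carrier: wavelength of about 5 mm, half-wavelength element spacing
wavelength = 3e8 / 60e9
print(angle_of_arrival(math.pi / 2, spacing_m=wavelength / 2, wavelength_m=wavelength))
# ~30 degrees
```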

After developing and testing the INWADE framework, Yuksel foresees additional challenges and considerations that may require further study when implementing machine learning.

“A key theoretical endeavor is to understand if multiple machine learning agents can co-exist at different layers of the wireless network and still attain improved performance without jeopardizing each other’s goals,” he says.

Although Yuksel is the principal investigator for the research, he credits his students and collaborators for much of his success.

“My students help in performing the experiments and gathering results,” he says. “I am indebted to them. We are also collaborating with Clemson as they are working on designing new machine learning methods for the problems we are tackling.”

Yuksel’s work continues, and he is optimistic that his research will further benefit the greater scientific endeavor of making wireless networks accessible for all.

“The potential for this effort is huge,” he says. “I consider the radio spectrum to be a critical natural resource, like water or clean air. As machine learning methods are advancing, being able to use them for better sharing the spectrum and solving critical wireless challenges is very much needed.”

Distribution A. Approved for public release: Distribution Unlimited: AFRL-2024-2894 on 17 Jun 2024

Researcher’s Credentials

Yuksel is a professor at UCF’s Department of Electrical and Computer Engineering and served as its interim chair from 2021 to 2022. He received his doctoral degree in computer science from Rensselaer Polytechnic Institute in 2002. Yuksel’s research interests include wireless systems, optical wireless, and network management, and he has multiple ongoing research projects funded by the National Science Foundation.

Story from UCF Researcher Clearing the Way for Smart Wireless Networks by Edward Duryea for UCF Today

Google Introduces Project Naptime for AI-Powered Vulnerability Research

Google has developed a new framework called Project Naptime that it says enables a large language model (LLM) to carry out vulnerability research with an aim to improve automated discovery approaches.

"The Naptime architecture is centered around the interaction between an AI agent and a target codebase," Google Project Zero researchers Sergei Glazunov and Mark Brand said . "The agent is provided with a set of specialized tools designed to mimic the workflow of a human security researcher."

The initiative is so named for the fact that it allows humans to "take regular naps" while it assists with vulnerability research and automating variant analysis.

The approach, at its core, seeks to take advantage of advances in code comprehension and general reasoning ability of LLMs, thus allowing them to replicate human behavior when it comes to identifying and demonstrating security vulnerabilities.

It encompasses several components such as a Code Browser tool that enables the AI agent to navigate through the target codebase, a Python tool to run Python scripts in a sandboxed environment for fuzzing, a Debugger tool to observe program behavior with different inputs, and a Reporter tool to monitor the progress of a task.
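
Google has not published Naptime's code, but the tool-dispatch pattern described above can be sketched generically. Every function below is a hypothetical stub standing in for the real Code Browser, sandbox, debugger and reporter components, and the loop structure is an assumption for illustration rather than the actual architecture.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ToolCall:
    name: str
    argument: str

# Hypothetical tool registry mirroring the components described above
def code_browser(path: str) -> str:
    return f"<source listing for {path}>"           # would read the target codebase

def run_python(snippet: str) -> str:
    return "<sandboxed execution output>"           # would fuzz inputs in a sandbox

def debugger(command: str) -> str:
    return "<program state under the given input>"  # would observe program behavior

def reporter(note: str) -> str:
    return "recorded"                               # would track task progress

TOOLS: Dict[str, Callable[[str], str]] = {
    "code_browser": code_browser,
    "run_python": run_python,
    "debugger": debugger,
    "reporter": reporter,
}

def agent_step(model_decision: ToolCall) -> str:
    """Dispatch one LLM-chosen action to the matching tool and return its observation."""
    return TOOLS[model_decision.name](model_decision.argument)

print(agent_step(ToolCall("code_browser", "src/parser.c")))
```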

Google said Naptime is also model-agnostic and backend-agnostic, and better at flagging buffer overflow and advanced memory corruption flaws, according to CYBERSECEVAL 2 benchmarks. CYBERSECEVAL 2, released earlier this April by researchers from Meta, is an evaluation suite to quantify LLM security risks.

In tests carried out by the search giant to reproduce and exploit the flaws, the two vulnerability categories achieved new top scores of 1.00 and 0.76, up from 0.05 and 0.24, respectively, for OpenAI GPT-4 Turbo.

"Naptime enables an LLM to perform vulnerability research that closely mimics the iterative, hypothesis-driven approach of human security experts," the researchers said. "This architecture not only enhances the agent's ability to identify and analyze vulnerabilities but also ensures that the results are accurate and reproducible."

The Epistemological Consequences of Artificial Intelligence, Precision Medicine, and Implantable Brain-Computer Interfaces

I argue that this examination and appreciation for the shift to abductive reasoning should be extended to the intersection of neuroscience and novel brain-computer interfaces too. This paper highlights the implications of applying abductive reasoning to personalized implantable neurotechnologies. Then, it explores whether abductive reasoning is sufficient to justify insurance coverage for devices absent widespread clinical trials, which are better applied to one-size-fits-all treatments. 

INTRODUCTION

In contrast to the classic model of randomized-control trials, often with a large number of subjects enrolled, precision medicine attempts to optimize therapeutic outcomes by focusing on the individual. [i] A recent publication highlights the strengths and weakness of both traditional evidence-based medicine and precision medicine. [ii] Plus, it outlines a tension in the shift from evidence-based medicine’s inductive reasoning style (the collection of data to postulate general theories) to precision medicine’s abductive reasoning style (the generation of an idea from the limited data available). [iii] The paper’s main example is the application of precision medicine for the treatment of cancer. [iv] I argue that this examination and appreciation for the shift to abductive reasoning should be extended to the intersection of neuroscience and novel brain-computer interfaces too.

As the name suggests, brain-computer interfaces are a significant advancement in neurotechnology that directly connects someone’s brain to external or implanted devices. [v] Among the various kinds of brain-computer interfaces, adaptive deep brain stimulation devices require numerous personalized adjustments to their settings during the implantation and computation stages in order to provide adequate relief to patients with treatment-resistant disorders. What makes these devices unique is how adaptive deep brain stimulation integrates a sensory component to initiate the stimulation. While not commonly at the level of sophistication as self-supervising or generative large language models, [vi] they currently allow for a semi-autonomous form of neuromodulation. This paper highlights the implications of applying abductive reasoning to personalized implantable neurotechnologies. Then, it explores whether abductive reasoning is sufficient to justify insurance coverage for devices absent widespread clinical trials, which are better applied to one-size-fits-all treatments. [vii]

I.     The State of Precision Medicine in Oncology and the Epistemological Shift

While a thorough overview of precision medicine for the treatment of cancer is beyond the scope of this article, its practice can be roughly summarized as identifying clinically significant characteristics a patient possesses (e.g., genetic traits) to land on a specialized treatment option that, theoretically, should benefit the patient the most. [viii] However, in such a practice of stratification, patients fall into smaller and smaller populations, and the quality of evidence that can be applied to anyone outside these groups decreases in turn. [ix] As inductive logic helps to articulate, the greater the number of patients that respond to a particular therapy, the higher the probability of its efficacy. By straying from this logical framework, precision medicine opens the treatment of cancer to more uncertainty about the validity of these approaches to the resulting disease subcategories. [x] Thus, while contemporary medical practices explicitly describe some treatments as “personalized”, they ought not be viewed as inherently better founded than other therapies. [xi]

A relevant contemporary case of precision medicine out of Norway focuses on the care of a patient with cancer between the ventricles of the heart and esophagus, which had failed to respond to the standard regimen of therapies over four years. [xii] In a last-ditch effort, the patient elected to pay out-of-pocket for an experimental immunotherapy (nivolumab) at a private hospital. He experienced marked improvements and a reduction in the size of the tumor. Understandably, the patient tried to pursue further rounds of nivolumab at a public hospital. However, the hospital initially declined to pay for it given the “lack of evidence from randomised clinical trials for this drug relating to this [patient’s] condition.” [xiii] In rebuttal to this claim, the patient countered that he was actually similar to a subpopulation of patients who responded in “open‐label, single arm, phase 2 studies on another immune therapy drug” (pembrolizumab). [xiv] Given this interpretation of the prior studies and the patient’s response, further rounds of nivolumab were approved. Had the patient not had improvements in the tumor’s size following a round of nivolumab, then pembrolizumab’s prior empirical evidence in isolation would have been insufficient, inductively speaking, to justify his continued use of nivolumab. [xv]

The case demonstrates a shift in reasoning from the traditional induction to abduction . The phenomenon of ‘cancer improvement’ is considered causally linked to nivolumab and its underlying physiological mechanisms. [xvi] However, “the weakness of abductions is that there may always be some other better, unknown explanation for an effect. The patient may for example belong to a special subgroup that spontaneously improves, or the change may be a placebo effect. This does not mean, however, that abductive inferences cannot be strong or reasonable, in the sense that they can make a conclusion probable .” [xvii] To demonstrate the limitations of relying on the abductive standard in isolation, commentators have pointed out that side effects in precision medicine are hard to rule out as being related to the initial intervention itself unless trends from a group of patients are taken into consideration. [xviii]

As artificial intelligence (AI) assists the development of precision medicine for oncology, this uncertainty ought to be taken into consideration. The implementation of AI has been crucial to the development of precision medicine by providing a way to combine large patient datasets or a single patient with a large number of unique variables with machine learning to recommend matches based on statistics and probability of success upon which practitioners can base medical recommendations. [xix] The AI is usually not establishing a causal relationship [xx] – it is predicting. So, as AI bleeds into medical devices, like brain-computer interfaces, the same cautions about using abductive reasoning alone should be carried over.

II.     Responsive Neurostimulation, AI, and Personalized Medicine

Like precision medicine in cancer treatment, computer-brain interface technology similarly focuses on the individual patient through personalized settings. In order to properly expose the intersection of AI, precision medicine, abductive reasoning, and implantable neurotechnologies, the descriptions of adaptive deep brain stimulation systems need to deepen. [xxi] As a broad summary of adaptive deep brain stimulation, to provide a patient with the therapeutic stimulation, a neural signal, typically referred to as a local field potential, [xxii] must first be detected and then interpreted by the device. The main adaptive deep brain stimulation device with premarket approval, the NeuroPace Responsive Neurostimulation system, is used to treat epilepsy by detecting and storing “programmer-defined phenomena.” [xxiii] Providers can optimize the detection settings of the device to align with the patient’s unique electrographic seizures as well as personalize the reacting stimulation’s parameters. [xxiv] The provider adjusts the technology based on trial and error. One day machine learning algorithms will be able to regularly aid this process in myriad ways, such as by identifying the specific stimulation settings a patient may respond to ahead of time based on their electrophysiological signatures. [xxv] Either way, with AI or programmers, adaptive neurostimulation technologies are individualized and therefore operate in line with precision medicine rather than standard treatments based on large clinical trials.

Contemporary neurostimulation devices are not usually sophisticated enough to be prominent in AI discussions where the topics of neural networks, deep learning, generative models, and self-attention dominate the conversation. However, implantable high-density electrocorticography arrays (a much more sensitive version than adaptive deep brain stimulation systems use) have been used in combination with neural networks to help patients with neurologic deficits from a prior stroke “speak” through a virtual avatar. [xxvi] In some experimental situations, algorithms are optimizing stimulation parameters with increasing levels of independence. [xxvii] An example of neurostimulation that is analogous to the use of nivolumab in Norway surrounds a patient in the United States who was experiencing both treatment-resistant OCD and temporal lobe epilepsy. [xxviii] Given the refractory nature of her epilepsy, implantation of an adaptive deep brain stimulation system was indicated. As a form of experimental therapy, her treatment-resistant OCD was also indicated for the off-label use of an adaptive deep brain stimulation set-up. Another deep brain stimulation lead, other than the one implanted for epilepsy, was placed in the patient’s right nucleus accumbens and ventral pallidum region given the correlation these nuclei had with OCD symptoms in prior research. Following this, the patient underwent “1) ambulatory, patient-initiated magnet-swipe storage of data during moments of obsessive thoughts; (2) lab-based, naturalistic provocation of OCD-related distress (naturalistic provocation task); and (3) lab-based, VR [virtual reality] provocation of OCD-related distress (VR provocation task).” [xxix] Such signals were used to identify when to deliver the therapeutic stimulation in order to counter the OCD symptoms. Thankfully, following the procedure and calibration the patient exhibited marked improvements in their OCD symptoms and recently shared her results publicly. [xxx]

In both cases, there is a similar level of abductive justification for the efficacy of the delivered therapy. In the case study in which the patient was treated with adaptive deep brain stimulation, they at least had their neural activity tested in various settings to determine the optimum parameters for treatment to avoid them being based on guesswork. Additionally, the adaptive deep brain stimulation lead was already placed before the calibration trials were conducted, meaning that the patient had already taken on the bulk of the procedural risk before the efficacy could be determined. Such an efficacy test could have been replicated in the first patient’s cancer treatment, had it been biopsied and tested against the remaining immunotherapies in vitro . Yet, in the case of cancer with few options, one previous dose of a drug that appeared to work on the patient may justify further doses. However, as the Norwegian case presents, corroboration with known responses to a similar drug (from a clinical trial) could be helpful to validate the treatment strategy. (It should be noted that both patients were resigned to these last resort options regardless of the efficacy of treatment.)

There are some elements of inductive logic seen with adaptive deep brain stimulation research in general. For example, abductively the focus could be that patient X’s stimulation parameters are different from patient Y’s and patient Z’s. In contrast, when grouped as subjects who obtained personalized stimulation, patients X, Y, and Z demonstrate an inductive aspect to this approach’s safety and/or efficacy. The OCD case holds plenty of abductive characteristics in line with precision medicine’s approach to treating cancer and as more individuals try the method, there will be additional data. With the gradual integration of AI into brain-computer interfaces in the name of efficacy, this reliance on abduction will continue, if not grow, over time. Moving forward, if a responsive deep brain stimulation treatment is novel and individualized (like the dose of nivolumab) and there is some other suggestion of efficacy (like clinical similarities to other patients in the literature), then it may justify insurance coverage for the investigative intervention, absent other unrelated reasons to deny it.

III.     Ethical Implications and Next Steps

While AI’s use in oncology and neurology is not yet as prominent as its use in other fields (e.g., radiology), it appears to be on the horizon for both. [xxxi] AI can be found in both the functioning of the neurotechnologies as well as the implementation of precision medicine. The increasing use of AI may serve to further individualize both oncologic and neurological therapies. Given these implications and the handful of publications cited in this article, it is important to have a nuanced evaluation of how these treatments, which heavily rely on abductive justification, ought to be managed.

The just use of an abductive approach may become difficult as AI-infused precision medicine is further pursued. At baseline, such technology relies on a level of advanced technology literacy among the general public and could exclude populations who lack access to basic technological infrastructure or know-how from participation. [xxxii] Even among nations with adequate infrastructure, as more patients seek out implantable neurotechnologies, which require robust healthcare resources, the market will favor patient populations that can afford this complex care. [xxxiii]

If patients already have the means to pay for an initial dose/use of a precision medicine product out of pocket, should insurance providers be required to cover subsequent treatments? [xxxiv] That is, if a first dose of a cancer drug or a deep brain stimulator over its initial battery life is successful, patients may feel justified in having the costs of further treatments covered. The Norwegian patient’s experience implies there is a precedent for the idea that some public insurance companies ought to cover successful cancer therapies; however, insurance companies may not all see themselves as obligated to cover neurotechnologies that rely on personalized settings or that are based on precision/abductive research more than on clinical trials.

The fact that the cases outlined above rely on abductive style of reasoning implies that there may not be as strong a justification for coverage by insurance, as they are both experimental and individualized, when compared to the more traditional large clinical trials in which groups have the same or a standardized protocol (settings/doses). If a study is examining the efficacy of a treatment with a large cohort of patients or with different experimental groups/phases, insurance companies may conclude that the resulting symptom improvements are more likely to be coming from the devices themselves. A preference for inductive justification may take priority when ruling in favor of funding someone’s continued use of an implantable neurostimulator. There are further nuances to this discussion surrounding the classifications of these interventions as research versus clinical care that warrant future exploration, since such a distinction is more of a scale [xxxv] than binary and could have significant impacts on the “right-to-try” approach to experimental therapies in the United States. [xxxvi] Namely, given the inherent limitations of conducting large cohort trials for deep brain stimulation interventions on patients with neuropsychiatric disorders, surgically innovative frameworks that blend abductive and inductive methodologies, like with sham stimulation phases, have traditionally been used. [xxxvii] Similarly, for adaptive brain-computer interface systems, if there are no large clinical trials and instead only publications that demonstrate that something similar worked for someone else, then, in addition to the evidence that the first treatment/dose worked for the patient in question, the balance of reasoning would be valid and arguably justify insurance coverage. As precision approaches to neurotechnology become more common, frameworks for evaluating efficacy will be crucial both for insurance coverage and for clinical decision making.

ACKNOWLEDGEMENT

This article was originally written as an assignment for Dr. Francis Shen’s “Bioethics & AI” course at Harvard’s Center for Bioethics. I would like to thank Dr. Shen for his comments as well as my colleagues in the Lázaro-Muñoz Lab for their feedback.

[i] Jonathan Kimmelman and Ian Tannock, “The Paradox of Precision Medicine,” Nature Reviews. Clinical Oncology 15, no. 6 (June 2018): 341–42, https://doi.org/10.1038/s41571-018-0016-0.

[ii] Henrik Vogt and Bjørn Hofmann, “How Precision Medicine Changes Medical Epistemology: A Formative Case from Norway,” Journal of Evaluation in Clinical Practice 28, no. 6 (December 2022): 1205–12, https://doi.org/10.1111/jep.13649.

[iii] David Barrett and Ahtisham Younas, “Induction, Deduction and Abduction,” Evidence-Based Nursing 27, no. 1 (January 1, 2024): 6–7, https://doi.org/10.1136/ebnurs-2023-103873.

[iv] Vogt and Hofmann, “How Precision Medicine Changes Medical Epistemology,” 1208.

[v] Wireko Andrew Awuah et al., “Bridging Minds and Machines: The Recent Advances of Brain-Computer Interfaces in Neurological and Neurosurgical Applications,” World Neurosurgery , May 22, 2024, S1878-8750(24)00867-2, https://doi.org/10.1016/j.wneu.2024.05.104.

[vi] Mark Riedl, “A Very Gentle Introduction to Large Language Models without the Hype,” Medium (blog), May 25, 2023, https://mark-riedl.medium.com/a-very-gentle-introduction-to-large-language-models-without-the-hype-5f67941fa59e.

[vii] David E. Burdette and Barbara E. Swartz, “Chapter 4 - Responsive Neurostimulation,” in Neurostimulation for Epilepsy , ed. Vikram R. Rao (Academic Press, 2023), 97–132, https://doi.org/10.1016/B978-0-323-91702-5.00002-5.

[viii] Kimmelman and Tannock, 2018.

[ix] Kimmelman and Tannock, 2018.

[x] Simon Lohse, “Mapping Uncertainty in Precision Medicine: A Systematic Scoping Review,” Journal of Evaluation in Clinical Practice 29, no. 3 (April 2023): 554–64, https://doi.org/10.1111/jep.13789.

[xi] Kimmelman and Tannock, “The Paradox of Precision Medicine.”

[xii] Vogt and Hofmann, 1206.

[xiii] Vogt and Hofmann, 1206.

[xiv] Vogt and Hofmann, 1206.

[xv] Vogt and Hofmann, 1207.

[xvi] Vogt and Hofmann, 1207.

[xvii] Vogt and Hofmann, 1207.

[xviii] Vogt and Hofmann, 1210.

[xix] Mehar Sahu et al., “Chapter Three - Artificial Intelligence and Machine Learning in Precision Medicine: A Paradigm Shift in Big Data Analysis,” in Progress in Molecular Biology and Translational Science , ed. David B. Teplow, vol. 190, 1 vols., Precision Medicine (Academic Press, 2022), 57–100, https://doi.org/10.1016/bs.pmbts.2022.03.002.

[xx] Stefan Feuerriegel et al., “Causal Machine Learning for Predicting Treatment Outcomes,” Nature Medicine 30, no. 4 (April 2024): 958–68, https://doi.org/10.1038/s41591-024-02902-1.

[xxi] Sunderland Baker et al., “Ethical Considerations in Closed Loop Deep Brain Stimulation,” Deep Brain Stimulation 3 (October 1, 2023): 8–15, https://doi.org/10.1016/j.jdbs.2023.11.001.

[xxii] David Haslacher et al., “AI for Brain-Computer Interfaces,” 2024, 7, https://doi.org/10.1016/bs.dnb.2024.02.003.

[xxiii] Burdette and Swartz, “Chapter 4 - Responsive Neurostimulation,” 103–4; “Premarket Approval (PMA),” https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpma/pma.cfm?id=P100026.

[xxiv] Burdette and Swartz, “Chapter 4 - Responsive Neurostimulation,” 104.

[xxv] Burdette and Swartz, 126.

[xxvi] Sean L. Metzger et al., “A High-Performance Neuroprosthesis for Speech Decoding and Avatar Control,” Nature 620, no. 7976 (August 2023): 1037–46, https://doi.org/10.1038/s41586-023-06443-4.

[xxvii] Hao Fang and Yuxiao Yang, “Predictive Neuromodulation of Cingulo-Frontal Neural Dynamics in Major Depressive Disorder Using a Brain-Computer Interface System: A Simulation Study,” Frontiers in Computational Neuroscience 17 (March 6, 2023), https://doi.org/10.3389/fncom.2023.1119685; Mahsa Malekmohammadi et al., “Kinematic Adaptive Deep Brain Stimulation for Resting Tremor in Parkinson’s Disease,” Movement Disorders 31, no. 3 (2016): 426–28, https://doi.org/10.1002/mds.26482.

[xxviii] Young-Hoon Nho et al., “Responsive Deep Brain Stimulation Guided by Ventral Striatal Electrophysiology of Obsession Durably Ameliorates Compulsion,” Neuron (October 20, 2023), https://doi.org/10.1016/j.neuron.2023.09.034.

[xxix] Nho et al.

[xxx] Nho et al.; Erik Robinson, “Brain Implant at OHSU Successfully Controls Both Seizures and OCD,” OHSU News, accessed March 3, 2024, https://news.ohsu.edu/2023/10/25/brain-implant-at-ohsu-successfully-controls-both-seizures-and-ocd.

[xxxi] Awuah et al., “Bridging Minds and Machines”; Haslacher et al., “AI for Brain-Computer Interfaces.”

[xxxii] Awuah et al., “Bridging Minds and Machines.”

[xxxiii] Sara Green, Barbara Prainsack, and Maya Sabatello, “The Roots of (in)Equity in Precision Medicine: Gaps in the Discourse,” Personalized Medicine 21, no. 1 (January 2024): 5–9, https://doi.org/10.2217/pme-2023-0097.

[xxxiv] Green, Prainsack, and Sabatello, 7.

[xxxv] Robyn Bluhm and Kirstin Borgerson, “An Epistemic Argument for Research-Practice Integration in Medicine,” The Journal of Medicine and Philosophy: A Forum for Bioethics and Philosophy of Medicine 43, no. 4 (July 9, 2018): 469–84, https://doi.org/10.1093/jmp/jhy009.

[xxxvi] Vijay Mahant, “‘Right-to-Try’ Experimental Drugs: An Overview,” Journal of Translational Medicine 18 (June 23, 2020): 253, https://doi.org/10.1186/s12967-020-02427-4.

[xxxvii] Michael S. Okun et al., “Deep Brain Stimulation in the Internal Capsule and Nucleus Accumbens Region: Responses Observed during Active and Sham Programming,” Journal of Neurology, Neurosurgery & Psychiatry 78, no. 3 (March 1, 2007): 310–14, https://doi.org/10.1136/jnnp.2006.095315.

Ian Stevens

MA in Philosophy, University of Tasmania, Australia; MS in Bioethics, Harvard Medical School Center for Bioethics

This work is licensed under a Creative Commons Attribution 4.0 International License.
