Avoiding Vulnerabilities and Attacks with a Proactive
Strategy for Web Applications
Volume 3 - Issue 2
Shahzad Ashraf*
College of Internet of Things Engineering, Hohai University, Changzhou, China
Received: August 23, 2021; Published: August 30, 2021
Corresponding author: Shahzad Ashraf, College of Internet of Things Engineering, Hohai University, Changzhou, China
DOI: 10.32474/ARME.2021.03.000157
Abstract
As the number of users interacting with dark websites grows, it opens the door to vulnerabilities and malevolent actors, making web traffic unsafe and risky. To prevent such vulnerabilities and maleficent activities on dark websites, proactive strategic measures have been taken and the relevant hidden causes have been explored, which helps to overcome security risks during various web operations. In the first step, the web addresses in the dark web corpus were analyzed to check whether they are still available. To address the challenges of web-address mining, a script was designed that mines web address URLs by visiting multiple search engines based on user input. In the second step, another script was designed to check domains that are likely to become inactive for security reasons, such as onion sites (a minimal availability-check sketch for such addresses is given after the abstract). In the third step, various gaps were identified in dark-web hosting using crawls that create new links from configuration files. In the fourth step, maleficent activities in web traffic were identified through manual and automated testing. Proceeding to the fifth step, the web address lifespan was determined, which quantifies the duration between the first and last occurrences of a web address (also sketched after the abstract).
Finally, using Fisher's Exact Test (FET), two comparative scenarios were developed by considering the similar attributes of surface and dark websites and the role of operating system interaction with them (an illustrative FET computation follows the keyword list). In the first scenario, identifying the similar attributes of surface and dark websites, the roles of maleficent actors and spammers were investigated, and it was found that, overall, 86% and 80% of the attributes of surface and dark websites are identical. Similarly, in the second scenario, identifying how long operating systems have interacted with surface and dark websites, it was found that Windows, Linux, and Android based operating systems play a considerable role and render the content far more vulnerable, which creates high chances of information leakage. In the end, up to 40 days of user interaction with the surface and dark web were analyzed, revealing alarming statistics on vulnerabilities in network traffic, such as maleficent activity, spamming, and information leakage. At the same time, the interaction period of operating systems such as Windows, Linux, and Android with surface and dark websites was also statistically investigated.
While gathering the aforementioned findings, it was observed that most of the websites use a CMS, such as WordPress, Joomla, or Drupal, or various forum platforms, and are outdated, with either no patching or known vulnerabilities. Because they are hosted with old versions of the software or have not been updated with the latest patches, most URLs on the dark web are vulnerable to attacks. After this study, clear and up-to-date statistics on dark websites are unveiled, and it is recommended that the obtained statistics be considered before developing new applications in order to avoid such vulnerabilities.
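The availability check described in the first two steps can be illustrated with a short Python sketch. This is a minimal sketch rather than the study's actual script: the example addresses, the local Tor SOCKS proxy at 127.0.0.1:9050, and the "status below HTTP 500" liveness heuristic are assumptions made only for illustration.

# Minimal availability check for surface and .onion addresses.
# Assumptions (not from the paper): a local Tor SOCKS proxy listens on
# 127.0.0.1:9050, and the requests and PySocks packages are installed.
import requests

TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h resolves hostnames via Tor
    "https": "socks5h://127.0.0.1:9050",
}

def is_alive(url, timeout=30):
    """Return True if the address answers with an HTTP status below 500 (heuristic)."""
    proxies = TOR_PROXY if ".onion" in url else None
    try:
        resp = requests.get(url, proxies=proxies, timeout=timeout)
        return resp.status_code < 500
    except requests.RequestException:
        return False  # unreachable, timed out, or connection refused

# Placeholder corpus entries, not real addresses from the study
corpus = ["http://example.com", "http://exampleonionaddress.onion"]
for address in corpus:
    print(address, "alive" if is_alive(address) else "inactive")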
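The lifespan measure from the fifth step, the duration between the first and last occurrences of a web address, can be computed as in the following sketch; the (address, timestamp) record format and the sample log are assumed for illustration.

# Lifespan of each web address: time between its first and last occurrence
# in a crawl log of (address, timestamp) records (assumed format).
from datetime import datetime
from collections import defaultdict

def lifespans(records):
    """records: iterable of (address, datetime) pairs -> {address: lifespan in days}."""
    seen = defaultdict(list)
    for address, ts in records:
        seen[address].append(ts)
    return {addr: (max(ts) - min(ts)).days for addr, ts in seen.items()}

log = [
    ("http://example.onion", datetime(2021, 5, 1)),
    ("http://example.onion", datetime(2021, 6, 10)),
    ("http://example.com",   datetime(2021, 5, 20)),
]
print(lifespans(log))  # {'http://example.onion': 40, 'http://example.com': 0}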
Keywords: Meticulous Testing; Vulnerabilities; Webhosting; Surface Web; Dark Web; Problematic Target; CMS; Script; FET
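The Fisher's Exact Test used for the two comparative scenarios can be run with SciPy as shown below. The 2x2 contingency table contains hypothetical counts of identical and non-identical attributes, intended only to show the mechanics of the test, not the study's data.

# Fisher's Exact Test on a 2x2 contingency table of attribute matches.
# The counts below are hypothetical placeholders, not results from the study.
from scipy.stats import fisher_exact

#                    identical   not identical
contingency = [[43, 7],    # surface-web attributes
               [40, 10]]   # dark-web attributes

odds_ratio, p_value = fisher_exact(contingency, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p-value = {p_value:.3f}")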