
Detecting and analyzing phishing attacks through artificial intelligence

Patent Number
US11997138B1
Publication Date
2024-05-28
Applicant
KING FAISAL UNIVERSITY (Al Hasa, SA)
Inventors
Ahmed Alyahya; Mohammed Alzahrani
IPC Classification
H04L9/40
Technical Field
phishing, email, persuasion, message, spam, emails, recipient, signs
Region: Al Hasa

Abstract

Detection of phishing messages in network communications is performed by receiving a transmitted message and detecting characteristics of the message. A determination is made as to whether the message matches a pattern of a known phishing message in a database, and the message is classified as a phishing or spam message accordingly. If the message does not match a known phishing message pattern, the message is checked for common signs of phishing or spam by determining the severity of the threat embodied by the message, and the message is categorized as having phishing characteristics and according to that severity. In response to user responses to these threat determinations, the criteria for detecting phishing characteristics are adjusted, thereby automatically revising the criteria for future decisions as to whether a message represents suspected phishing.
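The decision flow summarized in the abstract can be illustrated with a minimal sketch. All pattern strings, sign weights, thresholds, and function names below are hypothetical illustrations, not values or structures taken from the patent; the point is only the sequence of checks: known-pattern match, common-sign severity scoring, and a feedback step that adjusts the detection criterion.

```python
# Minimal sketch of the decision flow described in the abstract.
# All pattern strings, sign weights, and thresholds are illustrative assumptions.

KNOWN_PHISHING_PATTERNS = ["verify your account", "urgent wire transfer"]
COMMON_SIGNS = {"password reset": 3, "click here": 2, "act now": 1}


def matches_known_pattern(text: str) -> bool:
    """Check the message against stored phishing patterns (database stand-in)."""
    lowered = text.lower()
    return any(pattern in lowered for pattern in KNOWN_PHISHING_PATTERNS)


def threat_severity(text: str) -> int:
    """Sum the weights of common phishing/spam signs present in the message."""
    lowered = text.lower()
    return sum(weight for sign, weight in COMMON_SIGNS.items() if sign in lowered)


def classify_message(text: str, severity_threshold: int) -> str:
    """Classify a message as phishing, suspected phishing, or clean."""
    if matches_known_pattern(text):
        return "phishing"
    severity = threat_severity(text)
    if severity >= severity_threshold:
        return f"suspected phishing (severity {severity})"
    return "clean"


def adjust_threshold(threshold: int, user_confirmed_threat: bool) -> int:
    """Feedback step: tighten the criterion when users confirm threats, relax it otherwise."""
    return max(1, threshold - 1) if user_confirmed_threat else threshold + 1


if __name__ == "__main__":
    threshold = 2
    print(classify_message("Please click here for a password reset", threshold))
    # -> suspected phishing (severity 5)
    threshold = adjust_threshold(threshold, True)  # user confirms the threat
```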

Description

  • 2—classification: Transformer. A Transformer is a neural network architecture that learns context from text, one example being the Bidirectional Encoder Representations from Transformers (BERT) language model used in natural language processing. BERT was introduced in October 2018 by researchers at Google and is implemented by Hugging Face (Hugging Face, Inc., New York). A transformer is a deep learning architecture based on the multi-head attention mechanism. Machine learning-based attention intuitively mimics cognitive attention: it calculates “soft” weights for each word (more precisely, for its embedding) in the context window, and is used in training large language models (LLMs). The use of BERT is given as a non-limiting example of transformer architecture, and any suitable transformer architecture can be used (a classification sketch follows this list).
  • This is accomplished by email text processing, and by content and link detection. Email text processing addresses text patterns and words, and includes text manipulation by converting text to lowercase and stripping out special characters, numbers, and stop words that could obfuscate the message intended to be conveyed by the text (a preprocessing sketch follows this list). By way of non-limiting examples, this can use a Masked Language Model (MLM), regular expressions (RegEx), and text-processing libraries such as Pandas, scikit-learn, Re (the Python regular-expression module), and/or the Natural Language Toolkit (NLTK). Pandas is a software library written for the Python programming language for data manipulation and analysis; in particular, it offers data structures and operations for manipulating numerical tables and time series. Scikit-learn is a free machine-learning library for the Python programming language.
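As referenced in the second item above, a minimal preprocessing sketch along these lines might look as follows. It uses the standard `re` module and NLTK's English stopword list; the function name `preprocess_email_text` is an illustrative assumption, not anything specified in the patent.

```python
import re

import nltk
from nltk.corpus import stopwords

# The stopword corpus must be available locally; download it once if needed.
nltk.download("stopwords", quiet=True)

STOP_WORDS = set(stopwords.words("english"))


def preprocess_email_text(text: str) -> str:
    """Lowercase the text and strip special characters, numbers, and stop words."""
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)  # drop special characters and digits
    tokens = [tok for tok in text.split() if tok not in STOP_WORDS]
    return " ".join(tokens)


if __name__ == "__main__":
    sample = "URGENT!!! Your account #12345 will be suspended. Click http://example.com now."
    print(preprocess_email_text(sample))
    # -> "urgent account suspended click http example com"
```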
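For the transformer-based classification in the first item, a hedged sketch using the Hugging Face `transformers` library is shown below. The `bert-base-uncased` checkpoint, the two-label setup, and the label mapping are assumptions for illustration; a freshly loaded classification head is untrained, so a real deployment would fine-tune the model on labeled phishing/legitimate messages before use.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumed checkpoint; any suitable transformer works

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

LABELS = {0: "legitimate", 1: "phishing"}  # assumed label mapping


def classify_with_transformer(text: str) -> str:
    """Tokenize the (preprocessed) message and run it through the BERT classifier."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[int(logits.argmax(dim=-1))]


if __name__ == "__main__":
    # Untrained classification head: output is meaningless until fine-tuned.
    print(classify_with_transformer("verify your account immediately or it will be closed"))
```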

    權(quán)利要求

    1