Search results for ZPAQ Compression
Unfolding Remcos RAT- 4.9.2 Pro
Malware Analysis of Remcos RAT: Exploitation and Detection Explained

Executive Summary

SHA256 hash: 2e5c4d023167875977767da513d8889f1fc09fb18fdadfd95c66a6a890b5ca3f

Remcos is a commercially available Remote Access Tool (RAT) marketed for legitimate use in surveillance and penetration testing, but it has been leveraged in numerous unauthorized hacking campaigns. When deployed, Remcos establishes a backdoor that allows comprehensive remote control over the affected system. The tool is a product of BreakingSecurity, a company specializing in cybersecurity solutions.

Attackers are getting smarter, relying on tricks such as code obfuscation and decoy code that make it harder for analysts to work out how their attacks operate. They also abuse image files and compression to disguise their activity.

YARA signature rules are attached in Appendix A. The malware sample and hashes have been submitted to VirusTotal for further examination.

High-Level Technical Summary

Remcos is an advanced remote access tool that compromises a machine through a chain of hidden payloads, starting with a malicious file delivered by mail or by a dropper. It disguises its next stage inside an image file, then uses another DLL to ensure it survives a restart. Remcos can record keystrokes to steal passwords and other private information, which it writes to a log file. It stays in contact with the attacker's server to exfiltrate this stolen information and receive new commands, giving the attacker ongoing surveillance of and control over the infected computer.

Malware Composition

This Remcos sample consists of the following components:

2e5c4d023167875977767da513d8889f1fc09fb18fdadfd95c66a6a890b5ca3f Embedded_Remcos.exe

In the C# dropper, the malware developers hide malicious code inside the InitializeComponent() method. This method is normally used only to set up the application's user interface, such as buttons and menus.
Here, however, it is abused to hide something harmful. Because the malicious code looks just like ordinary setup code, it is hard to spot: something bad is concealed inside something trusted, so nobody notices it. This complicates both detection and remediation, and shows how attackers can turn parts of an application we normally trust into a hiding place.

The dropper extracts a byte array from a resource, possibly a file or other data embedded in the application, and generates another byte array from a hard-coded string. The code sets up the form's user interface and then performs an operation on a data resource ("SHP") using the generated key. The _data buffer before decryption looks like this.

The for loop processes the Data_ array in a convoluted way: it walks each byte of Data_ and modifies it based on a calculation involving both Data_ and KeyGen. The calculation combines bitwise XOR (^), addition, and modulo operations; it appears to be a decryption routine in which Data_ is transformed using the KeyGen byte array.

First, a MethodBase object named methodBase is assigned the value kb. The MethodBase class in C# is part of the reflection namespace and is used to discover information about methods (such as constructors and other methods) at runtime. Then an object array named array is created and initialized with string values: this.VC, this.VR, and the literal string "Boilerplate". VC and VR are private string fields of the class, initialized to "57775972" and "6C7978", respectively. Finally, the Invoke method on methodBase is called, passing obj and array as arguments.
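Stepping back to the for loop over Data_: the decompiled listing does not reproduce its exact arithmetic, but key-indexed XOR routines of this family are common in .NET droppers. The sketch below is an illustrative assumption, not the dropper's literal algorithm; the function name and the simple XOR-with-key-byte scheme are mine.

```python
def xor_decode(data: bytes, key: bytes) -> bytes:
    """Rolling XOR: combine each payload byte with a key byte chosen
    by index modulo the key length (a common dropper scheme)."""
    out = bytearray(len(data))
    for i, b in enumerate(data):
        # XOR is its own inverse, so the same routine both
        # encrypts and decrypts the buffer.
        out[i] = b ^ key[i % len(key)]
    return bytes(out)
```

Because XOR is symmetric, running the routine twice with the same KeyGen-style byte array yields the original Data_ buffer, which is why a single loop can serve as both the encryption and decryption path.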
This means the method represented by methodBase is executed with obj as the target and the string array as the parameters. Before the Invoke call, a binary had already been loaded successfully into the modules list, and looking closely at kb.FullName, it is calling the dr.hA.wP method in the Ben DLL.

Ben DLL

By setting a breakpoint right after the module is loaded, we catch it in the debugger. The code performs image processing, uses reflection to invoke a method, and dynamically loads an assembly from a byte array. Operations like these are typical of applications that manipulate images, execute code dynamically, and potentially load plugins or modules at runtime. It then sleeps for 16 seconds.

new MemoryStream(array2): this creates a new MemoryStream object using array2 as its buffer. array2 is assumed to be a byte array (byte[]) containing data compressed with the GZip algorithm. A MemoryStream is a stream backed by a memory buffer, allowing reads from and writes to memory.

new GZipStream(...): this creates a new GZipStream object. The GZipStream class compresses and decompresses data in the GZip format. Here it is constructed with the MemoryStream created above and CompressionMode.Decompress, indicating that it will be used to decompress the data contained in array2.

This is part of a process involving dynamic loading and reflection: the code reads and processes data from the MemoryStream, uses that data to load an assembly or access its contents, and then retrieves a specific type from that assembly.

Rd is designed to dynamically load a .NET assembly from a byte array, denoted u0020. It employs a nested, infinite loop structure with a switch statement for control flow. Initially it attempts to load the assembly using Assembly.Load(u0020). The code's flow is influenced by the result of global::dr.hA.EV(), a method call whose purpose is unclear. If EV() returns a non-null value, the method exits the loop prematurely via a goto statement.
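The MemoryStream/GZipStream pair described above has a direct Python equivalent; a minimal sketch of the decompression step (the function name is mine):

```python
import gzip
import io

def gunzip_payload(array2: bytes) -> bytes:
    """Equivalent of new GZipStream(new MemoryStream(array2),
    CompressionMode.Decompress): wrap the in-memory buffer in a
    stream and read the decompressed bytes back out."""
    with gzip.GzipFile(fileobj=io.BytesIO(array2), mode="rb") as gz:
        return gz.read()
```

The decompressed bytes are handed straight to Assembly.Load, which is why no decompressed file ever touches disk.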
The method's coding style, characterized by unconventional variable naming and complex looping, suggests obfuscation, likely intended to conceal the actual functionality or make reverse engineering more difficult.

ReactionDiffusion

After the assembly is loaded, a new binary appears in the modules list under the name ReactionDiffusion. The code then disposes of the MemoryStream, which means its work is done here; execution will presumably move on to the next binary. After that it also disposes of the GZip stream that was used to obtain the binary.

Now let's track where execution goes next in ReactionDiffusion. Inspecting the object it points to, the object's type reveals the destination namespace and class. Since there were no method calls from the previous binary, we set a breakpoint on the constructor, and execution hit it exactly. There was nothing useful in ReactionDiffusion; it may all have been decoy code. Let's see what the Ben binary does next: in case 8 it fetches a bitmap from resources.

RS Method

The RS method in C# retrieves a Bitmap image from resources using reflection and obfuscated code patterns. It starts by declaring a ResourceManager to access embedded resources, using a resource name dynamically constructed from the first string parameter, u0020. This parameter, along with a similarly named second parameter, is used in a nested, infinite loop structure with a switch statement. The Bitmap is obtained by the method global::dr.hA.rY, which likely extracts the image from the resources. The control flow includes checks with global::dr.hA.EV() and global::dr.hA.m3(), whose purposes are unclear, but they appear to influence the flow and decision-making within the method.
The use of obfuscated names (like u0020) and complex control flow suggests an intent to mask the code's functionality or purpose.

Loading the assembly from a byte array

The code defines a private static method named Rd that takes a byte array u0020 as its parameter. It initializes an integer variable num with the value 1. Inside an infinite loop (for (;;)), it declares a variable num2, assigns it the value of num, and enters another infinite loop. Within the inner loop there is a switch statement with two cases:

Case 1: it attempts to load an assembly using Assembly.Load(u0020), where u0020 is the byte array passed to the method. If the assembly loads successfully, it sets num2 to 0. It then checks whether global::dr.hA.EV() is non-null; if so, execution jumps to the Block_1 label, otherwise it exits the loop and returns the loaded assembly.

Default case: if no case matches, it returns the assembly variable, which was assigned earlier in the code.

The Block_1 label marks where execution continues when global::dr.hA.EV() is non-null; it contains no specific logic in the provided snippet.

Tyrone

It looks like another binary is coming: a further DLL is loaded into the modules list under the name Tyrone, and the code invokes AJBqklj3Jn from Tyrone (YcMqTyPiynJnoycycL.MhMHeAYqAZ6AJWSu3o). This stage is more heavily obfuscated than the previous binaries.

It checks for the presence of a named mutex, "wnmJOXavioKPdkNYG", which malware commonly uses for synchronization or to ensure only one instance runs. It tries to open the mutex; if no such mutex exists, an exception is thrown and execution continues, but if the mutex already exists, the process terminates itself on the next line.

Creating the Mutex

It then creates a new Mutex object with the name "wnmJOXavioKPdkNYG".
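The single-instance logic can be illustrated with a cross-platform analogue. Windows named mutexes have no direct Python stdlib equivalent, so this sketch substitutes an atomically created marker file for the mutex; the file-based mechanism is my analogy for illustration, not what Remcos actually does.

```python
import os
import tempfile

MUTEX_NAME = "wnmJOXavioKPdkNYG"  # name observed in the sample

def claim_instance(name: str = MUTEX_NAME) -> bool:
    """Return True if we are the first instance (CreateMutex path),
    False if another instance already holds the name (OpenMutex
    succeeds, so the malware would terminate itself)."""
    marker = os.path.join(tempfile.gettempdir(), name + ".lock")
    try:
        # O_EXCL makes creation atomic: it fails if the marker already
        # exists, mirroring the open-then-create logic around the mutex.
        os.close(os.open(marker, os.O_CREAT | os.O_EXCL | os.O_WRONLY))
        return True
    except FileExistsError:
        return False
```

The first call claims the name and returns True; every later call while the marker exists returns False, which is exactly the decision point where the sample exits.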
Mutexes are synchronization primitives used to control access to shared resources among multiple threads or processes.

All of this was to obtain the AppData path and append "EiHjExP.exe" to it: C:\Users\<username>\AppData\Roaming\EiHjExP.exe. If the file is not already there, the malware copies itself to that location.

Changing Directory Permissions

It adds access control entries to the DirectorySecurity object using the MhMHeAYqAZ6AJWSu3o.PR6qMi9p2U method. These entries define permissions for specific file system rights (e.g., Read, ReadAndExecute, Delete, Write) with different access control types (Allow, Deny). The permissions are set with various inheritance and propagation flags, which determine how they are inherited by child objects. It removes the current user's rights to modify or write the file; as you can see, those permissions are now denied. Remcos does this to protect itself from being changed or deleted from disk.

It then fetches a Base64-encoded string from the modules of the Tyrone binary. I decoded this string with https://www.base64decode.org/ and it turned out to be XML; Remcos also contains its own decoding code.

Next, a function is called to tamper with Microsoft security. This function decodes the text fetched from the module, creates a new process, assigns a new StartInfo to it, and sets the file name to "powershell" (also obtained from the module). As process arguments it passes @"Add-MpPreference -ExclusionPath ""C:\Users\shaddy\AppData\Roaming\EiHjExP.exe""", and the process's window is set to hidden.

Windows Defender Exclusion

The path is added to the Defender exclusions, but keep in mind that I was running the sample as admin; without admin rights it would not be able to add the exclusion, since no privilege escalation has been performed so far.

Path.GetTempFileName() returns a string representing a unique temporary file name.
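Returning to the Defender-exclusion step: the PowerShell invocation is just a string-building exercise. A sketch (the helper name is mine) showing how the doubled quotes in the C# verbatim string collapse into one literal quote around the path:

```python
def build_powershell_args(exe_path: str) -> str:
    """In a C# verbatim string (@"..."), a double quote is escaped by
    doubling it, so ""X"" becomes "X" in the argument PowerShell
    finally receives: the path ends up quoted exactly once."""
    return 'Add-MpPreference -ExclusionPath "{}"'.format(exe_path)
```

Add-MpPreference requires elevation, which is why the exclusion only succeeds in the admin scenario described above.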
This file name is generated using a combination of a temporary directory path and a unique identifier, making it highly unlikely to clash with other temporary files on the system. The malware gets the identity of the current user and the executable path in order to update the XML; at the breakpoint it is updating the XML and saving it in the text variable. The clean XML code follows.

Persistence

After that, it writes the XML to the tmp file. It then loads the task-scheduling command from its modules and sets the process StartupInfo. The process is executed with a hidden window style, the file name "schtasks.exe", and the following arguments:

@"/Create /TN ""Updates\EiHjExP"" /XML ""C:\Users\shaddy\AppData\Local\Temp\tmp66E3.tmp"""

This command creates a new scheduled task named "Updates\EiHjExP" and configures it using the XML file located at C:\Users\shaddy\AppData\Local\Temp\tmp66E3.tmp, triggering the executable after every system restart. It then deletes the tmp file.

After that it loads a new assembly, "xF7siMsac", from its resource manager, injects this final binary, and executes it. Let's look at the injection inside Process Hacker. This is yet another binary extracted and deobfuscated from resources.

Remcos / 5th stage

After saving the binary from u0020, it looks exactly like a client agent built with the original Remcos builder from hxxps://breakingsecurity.net/remcos/. The logo is also the same, but its signature was not present in any online threat intelligence.

https://www.virustotal.com/gui/search/f55fc4f4e1bcbe957d20750f56cd98869c717c18c14c8b6d42698557b254ad51

This final stage was developed in C++. Even before analysis, filtering the strings turned up references to Remcos; this pattern appears in almost every Remcos RAT. Now let's start the debugger to look deeper; we can see some more identifying details. It starts by calling the GetAddrInfoW API, which points to rungmotors20.ddns.net:60247. GetAddrInfoW is a Windows API function used for network operations.
It's part of the Windows Sockets (Winsock) API and is typically called to resolve network addresses or perform name resolution, converting a hostname such as a domain into an IP address that can be used to establish network connections.

If running with admin privileges, the malware creates the directory C:\ProgramData\remcos using the CreateDirectoryW API. CreateDirectoryW is a Windows API function for creating a new directory; the W suffix indicates that it uses wide characters (Unicode), as opposed to CreateDirectoryA, which uses ANSI characters. After creating the directory, it creates a file named logs.dat using the CreateFileW API.

There are privilege checks as well, and both cases are handled smoothly; only the paths differ. When executed as admin, it uses the C:\ProgramData\remcos folder, and it creates a thread that performs these steps in a loop. When executed with normal permissions, it uses C:\Users\<username>\Local\VirtualStore\ProgramData\remcos\logs.dat.

It also sets its mark on the system in the registry, setting the exe path, licence, and time for the thread.

Patching TLS

All traffic was encrypted, so we needed to see what was being sent. A TLS check was enabled in our client build: with the TLS flag on, all traffic is sent encrypted, and we cannot see what is going to the server. After finding and patching this check, I was able to turn TLS off and observe all the traffic clearly.
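As an aside on the GetAddrInfoW call noted earlier: Python's socket.getaddrinfo wraps the same resolver machinery, so the C2 lookup can be mimicked like this (resolving localhost here rather than the real C2 domain):

```python
import socket

def resolve_host(host: str, port: int) -> list:
    """Return the resolved IP addresses for host:port, the Python
    counterpart of a GetAddrInfoW(host, service, ...) call."""
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # sockaddr[0] is the IP address string.
    return [info[4][0] for info in infos]
```

In the sample, the resolved address is then used to open the socket over which the check-in data below is sent.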
It was sending the device identification to the server every few seconds. This is sample data the RAT was sending:

$ KRemoteHost||DESKTOP-002IHON/shaddy||US||Windows 10 Enterprise (64 bit)||||8588939264||4.9.2 Pro||C:\ProgramData\remcos\logs.dat||C:\Users\shaddy\Desktop\5thstage.exe||||5thstage.exe - PID: 3308 - Module: 5thstage.exe - Thread: Main Thread 6232 - x32dbg [Elevated]||1||47||48556593||1||rungmotors20.ddns.net||Rmc-ZT6SIL||0||C:\Users\shaddy\Desktop\5thstage.exe||12th Gen Intel(R) Core(TM) i7-12700KF||Exe||||

Clipboard and Process Recording

Inside the thread it performs three major activities, since whoever built this client only wanted to record the clipboard, log keystrokes, and set some registry keys. It records all clipboard data in the same logs.dat file, simply prepending [Text copied to clipboard] and appending [End of clipboard]. It also keeps recording every process that spawns, its architecture, its user access, and all keystrokes.

Rules & IOCs

Yara Rules

rule remcos_pro_4_9_2
{
    meta:
        author = "Osama Ellahi"
        description = "Remcos RAT 4.9.2 pro version from breakpoint"
    strings:
        $string_match1 = "© by P.J. Plauger, licensed by Dinkumware, Ltd. ALL RIGHTS RESERVED" ascii fullword
        $string_match2 = "tRemcos v" ascii fullword
        $string_match3 = "BreakingSecurity.net" ascii fullword
        $string_match4 = "4.9.2 Pro" ascii fullword
        $string_match6 = "[Text pasted from clipboard]" ascii fullword
        $string_match7 = "[End of clipboard]" ascii fullword
        $string_match8 = "[End of clipboard]" ascii fullword
        $string_match9 = "[Text copied to clipboard]" ascii fullword
        $string_match11 = "Offline Keylogger Started" ascii fullword
        $string_match12 = "Offline Keylogger Stopped" ascii fullword
        $string_match13 = "Online Keylogger Started" ascii fullword
        $string_match14 = "Online Keylogger Stopped" ascii fullword
        $string_match15 = "Remcos restarted by watchdog!" ascii fullword
        $string_match16 = "Watchdog module activated" ascii fullword
        $string_match17 = "Watchdog launch failed!" ascii fullword
        $string_match18 = "[Chrome StoredLogins not found]" ascii fullword
        $string_match19 = "[Chrome StoredLogins found, cleared!]" ascii fullword
        $string_match20 = "[Chrome Cookies not found]" ascii fullword
        $string_match21 = "[Chrome Cookies found, cleared!]" ascii fullword
        $string_match22 = "[Firefox StoredLogins not found]" ascii fullword
        $string_match23 = "[Firefox Cookies not found]" ascii fullword
        $string_match24 = "[Firefox cookies found, cleared!]" ascii fullword
        $string_match25 = "[Firefox StoredLogins Cleared!]" ascii fullword
        $string_match26 = "[IE cookies not found]" ascii fullword
        $string_match27 = "[IE cookies cleared!]" ascii fullword
        $string_match28 = "[Cleared browsers logins and cookies.]" ascii fullword
        $string_paths1 = "\\AppData\\Local\\Google\\Chrome\\User Data\\Default\\Cookies" ascii fullword
        $string_paths2 = "\\AppData\\Roaming\\Mozilla\\Firefox\\Profiles\\" ascii fullword
        $string_paths3 = "Software\\Microsoft\\Windows\\CurrentVersion\\Explorer\\User Shell Folders" ascii fullword
        $string_paths4 = "Software\\Microsoft\\Windows\\CurrentVersion\\Run\\" ascii fullword
        $string_paths5 = "\\AppData\\Local\\Google\\Chrome\\User Data\\Default\\Login Data" ascii fullword
        $string_paths6 = "Software\\Microsoft\\EventSounds\\Sounds" ascii fullword
        $string_paths7 = "System\\CurrentControlSet\\Control\\MediaProperties\\PrivateProperties\\Joystick\\Winmm" ascii fullword
        $string_commands1 = "CreateObject(\"WScript.Shell\").Run \"cmd /c \"\"" ascii fullword
        $string_commands2 = "CreateObject(\"Scripting.FileSystemObject\").DeleteFile(Wscript.ScriptFullName)" ascii fullword
        $string_commands3 = "\\AppData\\Local\\Google\\Chrome\\User Data\\Default\\Login Data" ascii fullword
        $string_commands4 = "/k %windir%\\System32\\reg.exe ADD HKLM\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Policies\\System /v EnableLUA /t REG_DWORD /d 0 /f" ascii
        $string_url1 = "http://geoplugin.net/json.gp" ascii fullword
        $string_url2 = "rungmotors20.ddns.net" ascii fullword
    condition:
        uint16(0) == 0x5a4d and filesize < 600KB and filesize > 200KB
        and
        (
            any of ($string_url*)
            or 3 of ($string_paths*)
            or 5 of ($string_match*)
        )
}

Callback URLs

URL: rungmotors20.ddns.net, Port: 60247
URL: hxxp://geoplugin.net/json.gp, Port: 443

IOCs

1st stage:
SHA256 - 2e5c4d023167875977767da513d8889f1fc09fb18fdadfd95c66a6a890b5ca3f

2nd stage:
MD5 - 3125f77575829f3b710f5a15912dec20 *stage2.dll
SHA256 - 1cc58fba1d1b4c7e0b9d752ea7f03fa3c312ae2fc53796d5b3acea98e6ea3c0e *stage2.dll

3rd stage:
SHA256 - d01f3dea3851602ba5a0586c60430d286adf6fcc7e17aab080601a66630606e5 *stage3.dll
MD5 - 579197d4f760148a9482d1ebde113259 *stage3.dll

4th stage:
SHA256 - c5928572e371b0a5d3109d0a7431ca9e064216beb858f04dc8d0140ccaf44b84 *Tyrone.dll
MD5 - dd76e11ff9b96efdcf3cd377126c8d96 *Tyrone.dll

5th stage:
SHA256 - f55fc4f4e1bcbe957d20750f56cd98869c717c18c14c8b6d42698557b254ad51 *5thstage.mal
MD5 - dc05d4f2864dfafa9b91e8e0d79840e3 *5thstage.mal

References

https://www.joesandbox.com/analysis/1339230/0/html
https://www.jaiminton.com/reverse-engineering/remcos#part-2-decompiling-binary

Unfolding Remcos RAT - 4.9.2 Pro was originally published in InfoSec Write-ups on Medium.

Source: InfoSec Write-ups
New Agent Tesla Malware Variant Using ZPAQ Compression in Email Attacks
A new variant of the Agent Tesla malware has been observed delivered via a lure file with the ZPAQ compression format to harvest data from several email clients and nearly 40 web browsers. "ZPAQ is a file compression format that offers a better compression ratio and journaling function compared to widely used formats like ZIP and RAR," G Data malware analyst Anna Lvova said in a Monday analysis.

Source: The Hacker News
SysWings - Cloud & Managed services
Founded in 2017 to support startups in their IT strategy, in France and abroad, SysWings has extended its activities to the cloud and managed services. The team is made up of heterogeneous profiles, mixing employees and consultants, scaled according to your projects.

Crawlector - Threat Hunting Framework Designed For Scanning Websites For Malicious Objects
Crawlector (the name is a combination of Crawler & Detector) is a threat hunting framework designed for scanning websites for malicious objects.

Note-1: The framework was first presented at the No Hat conference in Bergamo, Italy on October 22nd, 2022 (Slides, YouTube Recording). It was presented a second time at the AVAR conference in Singapore on December 2nd, 2022.

Note-2: The accompanying tool EKFiddle2Yara (a tool that takes EKFiddle rules and converts them into Yara rules) mentioned in the talk was also released at both conferences.

Features

- Supports spidering websites to find additional links for scanning (up to 2 levels only)
- Integrates Yara as a backend engine for rule scanning
- Supports online and offline scanning
- Supports crawling for domains/sites digital certificates
- Supports querying URLhaus for finding malicious URLs on the page
- Supports hashing the page's content with TLSH (Trend Micro Locality Sensitive Hash) and other standard cryptographic hash functions such as md5, sha1, sha256, and ripemd128, among others (TLSH won't return a value if the page size is less than 50 bytes or if not enough randomness is present in the data)
- Supports querying the rating and category of every URL
- Supports expanding on a given site by attempting to find all available TLDs and/or subdomains for the same domain; this feature uses the Omnisint Labs API (the site is down as of March 10, 2023) and RapidAPI APIs, while the TLD expansion implementation is native. Together with rating and categorization, this provides the capability to find scam/phishing/malicious domains for the original domain
- Supports domain resolution (IPv4 and IPv6)
- Saves scanned website pages for later scanning (can be saved zip-compressed)
- The entirety of the framework's settings is controlled via a single customizable configuration file
- All scanning sessions are saved into a well-structured CSV file with a plethora of information about the website being scanned, in addition to information about the Yara rules that have triggered
- All HTTP(S) communications are proxy-aware
- One executable
- Written in C++

URLHaus Scanning & API Integration

This checks for malicious URLs on every page being scanned. The framework can query the list of malicious URLs from the URLHaus server (configuration: url_list_web) or from a file on disk (configuration: url_list_file); if the latter is specified, it takes precedence over the former. It works by searching the content of every page against all URL entries in url_list_web or url_list_file, checking for all occurrences. Additionally, upon a match, and if the configuration option check_url_api is set to true, Crawlector sends a POST request to the API URL set in the url_api configuration option, which returns a JSON object with extra information about the matching URL. Such information includes urlh_status (e.g., online, offline, unknown), urlh_threat (e.g., malware_download), urlh_tags (e.g., elf, Mozi), and urlh_reference (e.g., https://urlhaus.abuse.ch/url/1116455/). This information is included in the log file cl_mlog_<current_date><current_time><(pm|am)>.csv (see below) only if check_url_api is set to true. Otherwise, the log file will include the columns urlh_url (list of matching malicious URLs) and urlh_hit (number of occurrences for every matching malicious URL), conditional on whether check_url is set to true. The URLHaus feature can be disabled entirely by setting the configuration option check_url to false. Note that this feature can slow scanning, given the huge number of malicious URLs (~130 million entries at the time of this writing) that need to be checked, and the time it takes to get extra information from the URLHaus server (if check_url_api is set to true).

Files and Folders Structure

- cl_sites: where the list of sites to be visited or crawled is stored; supports multiple files and directories
- crawled: where all crawled/spidered URLs are saved to a text file
- certs: where all domains/sites digital certificates are stored (in .der format)
- results: where visited websites are saved
- pg_cache: program cache for sites that are not part of the spider functionality
- cl_cache: crawler cache for sites that are part of the spider functionality
- yara_rules: where all Yara rules are stored; all rules in this directory are loaded by the engine, parsed, validated, and evaluated before execution
- cl_config.ini: contains all the configuration parameters that can be adjusted to influence the behavior of the framework
- cl_mlog_<current_date><current_time><(pm|am)>.csv: log file containing a plethora of information about visited websites (date, time, status of Yara scanning, list of fired Yara rules with the offsets and lengths of each match, id, URL, HTTP status code, connection status, HTTP headers, page size, path to the saved page on disk, and other columns related to URLHaus results); file name is unique per session
- cl_offl_mlog_<current_date><current_time><(pm|am)>.csv: log file containing information about files scanned offline (list of fired Yara rules with the offsets and lengths of the matches, and path to the saved page on disk); file name is unique per session
- cl_certs_<current_date><current_time><(pm|am)>.csv: log file containing a plethora of information about found digital certificates
- expanded\exp_subdomain_<pm|am>.txt: contains discovered subdomains (part of the [site] section)
- expanded\exp_tld_<pm|am>.txt: contains discovered domains (part of the [site] section)

Configuration File (cl_config.ini)

It is very important that you familiarize yourself with the configuration file cl_config.ini before running any session. All of the sections and parameters are documented in the configuration file itself.
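The url_list matching behind the urlh_url and urlh_hit columns described above is essentially substring counting over the page body; a minimal Python sketch (the function name is mine):

```python
def urlhaus_hits(page: str, bad_urls) -> dict:
    """Return {url: occurrence_count} for every known-bad URL found
    in the page body, i.e. the data behind urlh_url / urlh_hit."""
    hits = {}
    for url in bad_urls:
        count = page.count(url)  # count all occurrences, not just one
        if count:
            hits[url] = count
    return hits
```

With ~130 million entries in the live URLHaus feed, a linear scan like this per page is exactly why the framework warns that the feature can slow scanning.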
The Yara offline scanning feature is a standalone option: if enabled, Crawlector executes only this feature, irrespective of other enabled features. The same is true for the crawling for domains/sites digital certificate feature. Either way, it is recommended that you disable all unused features in the configuration file.

Depending on the configuration settings (log_to_file or log_to_cons), if a Yara rule references only a module's attributes (e.g., PE, ELF, Hash, etc.), Crawlector will display only the rule's name upon a match, excluding offset and length data.

Sites Format Pattern

To visit/scan a website, the list of URLs must be stored in text files in the directory "cl_sites". Crawlector accepts three types of URLs:

- Type 1: one URL per line. Crawlector assigns a unique name to every URL, derived from the URL hostname.
- Type 2: one URL per line, with a unique name: [a-zA-Z0-9_-]{1,128} = <url>
- Type 3: for the spider functionality, a unique format is used, one URL per line:

<id>[depth:<0|1>-><d+>,total:<d+>,sleep:<d+>] = <url>

For example:

mfmokbel[depth:1->3,total:10,sleep:0] = https://www.mfmokbel.com

which is equivalent to:

mfmokbel[d:1->3,t:10,s:0] = https://www.mfmokbel.com

where <id> := [a-zA-Z0-9_-]{1,128}, and depth, total and sleep can be replaced with their shortened versions d, t and s, respectively.

depth: the spider supports going two levels deep to find additional URLs (a design decision). A value of 0 indicates a depth of level 1, with the value after the "->" ignored; a depth of level 1 is controlled by the total parameter, so the spider first tries to find that many additional URLs off of the specified URL. A value of 1 indicates a depth of level 2, with the value after the "->" representing the maximum number of URLs to find for every URL found per the total parameter. For clarification, as in the example above: first the spider looks for 10 URLs (as specified in the total parameter), and then each of those URLs is spidered up to a maximum of 3 URLs; in the best-case scenario we end up with 40 (10 + (10*3)) URLs.

The sleep parameter takes an integer value representing the number of milliseconds to sleep between every HTTP request.

Note 1: A Type 3 URL can be turned into a Type 1 URL by setting the configuration parameter live_crawler to false, in the spider section of the configuration file.

Note 2: Empty lines and lines that start with ";" or "//" are ignored.

The Spider Functionality

The spider functionality is what gives Crawlector the capability to find additional links on the targeted page. The spider supports the following features:

- The domain has to be of Type 3 for the spider functionality to work
- You may specify a list of wildcarded patterns (pipe delimited) to prevent spidering matching URLs via the exclude_url config option, e.g., *.zip|*.exe|*.rar|*.zip|*.7z|*.pdf|.*bat|*.db
- You may specify a list of wildcarded patterns (pipe delimited) to spider only URLs that match the pattern via the include_url config option, e.g., */checkout/*|*/products/*
- You may exclude HTTPS URLs via the config option exclude_https
- You may account for outbound/external links as well, for the main page only, via the config option add_ext_links; this feature honours the exclude_url and include_url config options
- You may account for outbound/external links of the main page only, excluding all other URLs, via the config option ext_links_only; this feature honours the exclude_url and include_url config options
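The Type 3 line format described above lends itself to a small parser. This sketch is mine (the regex and function name are not part of Crawlector) and accepts both the long and shortened parameter names:

```python
import re

SITE_LINE = re.compile(
    r"^(?P<id>[a-zA-Z0-9_-]{1,128})"
    r"\[(?:depth|d):(?P<depth>[01])->(?P<per_url>\d+),"
    r"(?:total|t):(?P<total>\d+),"
    r"(?:sleep|s):(?P<sleep>\d+)\]\s*=\s*(?P<url>\S+)$"
)

def parse_site_line(line: str):
    """Parse '<id>[depth:<0|1>-><d+>,total:<d+>,sleep:<d+>] = <url>'
    into a dict, or return None for non-matching (e.g. comment) lines."""
    m = SITE_LINE.match(line.strip())
    if m is None:
        return None
    fields = m.groupdict()
    for key in ("depth", "per_url", "total", "sleep"):
        fields[key] = int(fields[key])
    return fields
```

For the example line, total=10 and per_url=3, which reproduces the best-case count of 10 + 10*3 = 40 URLs given in the text.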
Site Ranking Functionality

This checks the ranking of a website:

- You give it a file with a list of websites and their ranking, in CSV format
- Services that provide website ranking lists include Alexa top-1m (discontinued as of May 2022), Cisco Umbrella, Majestic, Quantcast, Farsight and Tranco, among others
- CSV file format (2 columns only): the first column holds the ranking, the second the domain name
- If a cell contains quoted data, it is automatically dequoted
- Line breaks aren't allowed in quoted text
- Leading and trailing spaces are trimmed from cells read
- Empty and comment lines are skipped
- The section site_ranking in the configuration file provides some options to alter how the CSV file is read
- The performance of this query depends on the number of records in the CSV file
- Crawlector compares every entry in the CSV file against the domain being investigated, not the other way around
- Only the registered/pay-level domain is compared

Finding TLDs and Subdomains - [site] Section

The [site] section provides the capability to expand on a given site by attempting to find all available top-level domains (TLDs) and/or subdomains for the same domain.
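Returning to the site-ranking CSV rules above (dequoting, trimming, skipping empty and comment lines): they map naturally onto Python's csv module. A sketch, with the function name being mine:

```python
import csv
import io

def lookup_rank(csv_text: str, domain: str):
    """Scan a two-column rank,domain CSV and return the rank for the
    given registered domain, or None if absent.  Quoted cells are
    dequoted by csv; spaces are trimmed; empty/comment lines skipped."""
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) < 2:
            continue  # empty or malformed line
        rank, name = row[0].strip(), row[1].strip()
        if rank.startswith((";", "#", "//")):
            continue  # comment line
        if name == domain:
            return int(rank)
    return None
```

As the text notes, every CSV entry is compared against the domain under investigation, so lookup cost grows linearly with the number of records.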
If found, new tlds/subdomains will be checked like any other domain This feature uses the Omnisint Labs (https://omnisint.io/) and RapidAPI APIs Omnisint Labs API returns subdomains and tlds, whereas RapidAPI returns only subdomains (the Omnisint Labs API is down as of March 10, 2023, however, the implementation is still available in case the site is back up) For RapidAPI, you need a valid "Domains records" API key that you can request from RapidAPI, and plug it into the key rapid_api_key in the configuration file With find_tlds enabled, in addition to Omnisint Labs API tlds results, the framework attempts to find other active/registered domains by going through every tld entry, either, in the tlds_file or tlds_url If tlds_url is set, it should point to a url that hosts tlds, each one on a new line (lines that start with either of the characters ';', '#' or '//' are ignored) tlds_file, holds the filename that contains the list of tlds (same as for tlds_url; only the tld is present, excluding the '.', for ex., "com", "org") If tlds_file is set, it takes precedence over tlds_url tld_dl_time_out, this is for setting the maximum timeout for the dnslookup function when attempting to check if the domain in question resolves or not tld_use_connect, this option enables the functionality to connect to the domain in question over a list of ports, defined in the option tlds_connect_ports The option tlds_connect_ports accepts a list of ports, comma separated, or a list of ranges, such as 25-40,90-100,80,443,8443 (range start and end are inclusive) tld_con_time_out, this is for setting the maximum timeout for the connect function tld_con_use_ssl, enable/disable the use of ssl when attempting to connect to the domain If save_to_file_subd is set to true, discovered subdomains will be saved to "expandedexp_subdomain_<pm|am>.txt" If save_to_file_tld is set to true, discovered domains will be saved to "expandedexp_tld_<pm|am>.txt" If exit_here is set to true, then Crawlector bails 
out after executing this [site] function, irrespective of other enabled options. This means found sites won't be crawled/spidered.
Design Considerations
A URL page is retrieved by sending a GET request to the server, reading the server response body, and passing it to the Yara engine for detection. Some of the GET request attributes are defined in the [default] section of the configuration file, including the User-Agent and Referer headers and the connection timeout, among other options. Although Crawlector logs a session's data to a CSV file, converting it to an SQL file is recommended for better performance, manipulation and retrieval of the data. This becomes evident when you're crawling thousands of domains. Repeated domains/URLs in cl_sites are allowed.
Limitations
- Single threaded
- Static detection (no dynamic evaluation of a given page's content)
- No headless browser support, yet!
Third-party libraries used
- Chilkat: library for website spidering, HTTP communications, hashing, JSON parsing and file compression (ZIP), among others
- Yara: for rule scanning (v4.2.3)
- CrossGuid: for generating GUIDs/UUIDs
- Inih: for parsing the configuration file
- Rapidcsv: for parsing CSV files
- Color Console: for console coloring
- TLSH (Trend Micro Locality Sensitive Hash) (v4.8.2)
Contributing
Open for pull requests and issues. Comments and suggestions are greatly appreciated.
Author
Mohamad Mokbel (@MFMokbel)
Download Crawlector
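The inclusive port-range syntax accepted by tlds_connect_ports in the [site] section above (e.g. 25-40,90-100,80,443,8443) can be sketched as below. The function name is hypothetical; this only illustrates the documented syntax, not Crawlector's own parser.

```python
def parse_ports(spec):
    """Expand a spec like "25-40,90-100,80,443,8443" into a port list.

    Comma-separated entries; an entry is either a single port or an
    A-B range whose start and end are both inclusive, per the docs.
    """
    ports = []
    for part in spec.split(","):
        part = part.strip()
        if "-" in part:
            lo, hi = (int(x) for x in part.split("-", 1))
            ports.extend(range(lo, hi + 1))  # range ends are inclusive
        else:
            ports.append(int(part))
    return ports
```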

Source: KitPloit
Google links WinRAR exploitation to multiple state hacking groups
Google says multiple state-backed hacking groups are gaining arbitrary code execution on targets' systems by exploiting a high-severity vulnerability in WinRAR, a compression software with over 500 million users. [...]

Source: BleepingComputer
Google links WinRAR exploitation to Russian, Chinese state hackers
Google says that several state-backed hacking groups have joined ongoing attacks exploiting a high-severity vulnerability in WinRAR, a compression software used by over 500 million users, aiming to gain arbitrary code execution on targets' systems. [...]

Source: BleepingComputer
Pro-Russian Hackers Exploiting Recent WinRAR Vulnerability in New Campaign
Pro-Russian hacking groups have exploited a recently disclosed security vulnerability in the WinRAR archiving utility as part of a phishing campaign designed to harvest credentials from compromised systems. "The attack involves the use of malicious archive files that exploit the recently discovered vulnerability affecting the WinRAR compression software versions prior to 6.23 and traced as

Source: The Hacker News
Update Chrome Now: Google Releases Patch for Actively Exploited Zero-Day Vulnerability
Google on Wednesday rolled out fixes to address a new actively exploited zero-day in the Chrome browser. Tracked as CVE-2023-5217, the high-severity vulnerability has been described as a heap-based buffer overflow in the VP8 compression format in libvpx, a free software video codec library from Google and the Alliance for Open Media (AOMedia). Exploitation of such buffer overflow flaws can

Source: The Hacker News
Modern GPUs vulnerable to new GPU.zip side-channel attack
Researchers from four American universities have developed a new GPU side-channel attack that leverages data compression to leak sensitive visual data from modern graphics cards when visiting web pages. [...]

Source: BleepingComputer
Researchers Uncover New GPU Side-Channel Vulnerability Leaking Sensitive Data
A novel side-channel attack called GPU.zip renders virtually all modern graphics processing units (GPU) vulnerable to information leakage. "This channel exploits an optimization that is data dependent, software transparent, and present in nearly all modern GPUs: graphical data compression," a group of academics from the University of Texas at Austin, Carnegie Mellon University, University of

Source: The Hacker News
HTMLSmuggler - HTML Smuggling Generator And Obfuscator For Your Red Team Operations
The full explanation of what HTML Smuggling is may be found here. The primary objective of HTML smuggling is to bypass network security controls, such as firewalls and intrusion detection systems, by disguising malicious payloads within seemingly harmless HTML and JavaScript code. By exploiting the dynamic nature of web applications, attackers can deliver malicious content to a user's browser without triggering security alerts or being detected by traditional security mechanisms. Thanks to this technique, the download of a malicious file is not displayed in any way in modern IDS solutions. The main goal of the HTMLSmuggler tool is to create an independent JavaScript library with an embedded, user-defined malicious payload. This library may be integrated into your phishing sites/email HTML attachments/etc. to bypass IDS and IPS systems and deliver the embedded payload to the target user's system. An example of a created JavaScript library may be found here.
Features
- Built-in, highly configurable JavaScript obfuscator that fully hides your payload
- May be used both as an independent JS library or embedded in JS frameworks such as React, Vue.js, etc.
- The simplicity of the template allows you to add extra data handlers/compressions/obfuscations
Installation
1. Install the yarn package manager.
2. Install dependencies: yarn
3. Read the help message: yarn build -h
Usage
Preparation steps
1. Modify (or use my) javascript-obfuscator options in obfuscator.js; my preset is nice, but very slow.
2. Compile your JavaScript payload: yarn build -p /path/to/payload -n file.exe -t "application/octet-stream" -c
3. Get your payload from dist/payload.esm.js or dist/payload.umd.js. After that, it may be inserted into your page and called with the download() function. payload.esm.js is used in import { download } from 'payload.esm'; imports (ECMAScript standard). payload.umd.js is used in HTML script src and require('payload.umd'); imports (CommonJS, AMD and pure HTML).
Pure HTML example
A full example may be found here.
1. Do the preparation steps.
2. Import the created script into the HTML file (or insert it inline): <head> <script src="payload.umd.js"></script></head>
3. Call the download() function from the body: <body> <button onclick="download()">Some phishy button</button></body>
4. Happy phishing :)
VueJS example
A full example may be found here.
1. Do the preparation steps.
2. Import the created script into the Vue file: <script> import { download } from './payload.esm';</script>
3. Call the download() function: <template> <button @click="download()">Some phishy button</button></template>
4. Happy phishing :)
FAQ
Q: I get the error RangeError: Maximum call stack size exceeded; how do I solve it?
A: This issue is described here. To fix it, try disabling splitStrings in obfuscator.js or use a smaller payload (payloads up to 2 MB are recommended because of this issue).
Q: Why does my payload take so long to build?
A: The bigger the payload, the longer it takes to create the JS file. To decrease build time, try disabling splitStrings in obfuscator.js. Below is a table with estimated build times using the default obfuscator.js.
Payload size | Build time
525 KB | 53 s
1.25 MB | 8 m
3.59 MB | 25 m
Download HTMLSmuggler

Source: KitPloit
WinRAR ZIP Arbitrary Code Execution Vulnerability (CVE-2023-38831)
What is WinRAR?
WinRAR is a popular utility tool for file compression/decompression and archive management.
What is the Attack?
CVE-2023-38831 is an arbitrary code execution vulnerability that affects WinRAR before version 6.23. The vulnerability allows threat actors to create a ZIP file that contains a folder and a file with the same filename. Opening (some refer to this as "viewing") the file launches a malicious script in the folder.
Why is this Significant?
This is significant because WinRAR is widely used and CVE-2023-38831 was reportedly exploited as a 0-day in April 2023. As a result, multiple malware families have reportedly been deployed. FortiGuard Labs strongly recommends that all users of WinRAR update to the latest version as soon as possible.
What is the Vendor Solution?
The vendor has released WinRAR version 6.23, which includes a fix for CVE-2023-38831.
What FortiGuard Coverage is available?
FortiGuard Labs has the following AV signatures against the files reportedly used in attacks involving CVE-2023-38831: W32/Darkme.A!tr, W32/NDAoF, PossibleThreat.DU, W32/VB_AGen.EX!tr, W32/ETCH!tr, NSIS/Injector.15D3!tr, PossibleThreat.FORTIEDR.H, W32/PossibleThreat, Malicious_Behavior.SB.
Web filtering blocks all reported network IOCs.
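The folder/file same-name trick described above can be triaged with a simple archive check: flag any ZIP in which a file entry's name collides with a directory name (explicit or implied by another entry's path). This is a hedged heuristic sketch for spotting CVE-2023-38831-style archives, not FortiGuard's detection logic; real exploit archives vary.

```python
import zipfile

def has_samename_collision(src):
    """Return True if a file and a folder in the ZIP share a name.

    src may be a path or a file-like object. Directories are taken
    from explicit "dir/" entries and from parent components of every
    entry, since exploit archives may omit explicit dir records.
    """
    names = zipfile.ZipFile(src).namelist()
    files = {n for n in names if not n.endswith("/")}
    dirs = set()
    for n in names:
        parts = n.rstrip("/").split("/")
        # explicit directory entries plus every implied parent dir
        end = len(parts) if n.endswith("/") else len(parts) - 1
        for i in range(1, end + 1):
            dirs.add("/".join(parts[:i]))
    return bool(files & dirs)
```

A hit is only a triage signal; a benign archive could theoretically contain such a collision, so flagged files still warrant manual inspection.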
Source: FortiGuard Labs | FortiGuard Center - Threat Signal Report