# -*- coding: utf-8 -*-

# Copyright © Cloud Linux GmbH & Cloud Linux Software, Inc 2010-2021 All Rights Reserved
#
# Licensed under CLOUD LINUX LICENSE AGREEMENT
# http://cloudlinux.com/docs/LICENSE.TXT

"""
This module contains the DecisionMaker class
"""
__package__ = 'ssa.modules'

import json
import logging
import os
from os.path import isfile

import numpy as np

from ssa.db import setup_database
from .common import Common
from .storage import (
    iter_domains_data,
    iter_urls_data,
    get_url_durations
)
from ..configuration import load_tunables
from ..configuration.schemes import ssa_tunables_schema
from ..internal.constants import report_path
from ..internal.utils import previous_day_date, sentry_init


class DecisionMaker(Common):
    """
    SSA Decision maker implementation.
    """

    def __init__(self, engine=None):
        super().__init__()
        self.logger = logging.getLogger('decision_maker')
        self.logger.info('DecisionMaker enabled: %s', __package__)

        self.engine = engine if engine else setup_database()

    def __call__(self):
        self.logger.info('DecisionMaker started')
        self.logger.debug('DecisionMaker loaded config: %s', self.config)
        self.external_tunables = self.load_external_conf()
        self.logger.debug('DecisionMaker loaded tunables: %s',
                          self.external_tunables)
        report = self.data_processing()
        self.add_json_report(report)
        self.logger.info('DecisionMaker report: %s', report)
        return report

    @staticmethod
    def _report_file(name) -> str:
        """
        Full path to given filename in DM reports directory
        """
        return os.path.join(report_path, name)

    @property
    def current_report_file(self) -> str:
        """
        Full path to current DM report: report.json in DM reports directory
        """
        return self._report_file('report.json')

    @property
    def _empty_report(self) -> dict:
        """
        Returns an empty report
        """
        return dict(date=previous_day_date(), domains=[])

    @property
    def solo_filtered_options(self) -> set:
        return {'correlation'}

    @staticmethod
    def load_external_conf():
        """Load external configuration values"""
        return load_tunables('ssa.json', ssa_tunables_schema)

    def data_processing(self) -> dict:
        """
        Iterate over the list of domains and, for each domain, over its list
        of urls. While processing the data, build the resulting report
        dictionary.
        """
        report = self._empty_report
        for domain_data in iter_domains_data(self.engine):
            # goes through the list of domains
            urls_data = list()
            domain_slow_reqs = 0

            domain_url_durations = dict(get_url_durations(
                self.engine, domain_data.domain_name))
            for domain_data_key, domain_data_value in iter_urls_data(self.engine,
                                                                     domain_data.domain_name,
                                                                     list(domain_url_durations.keys())):
                if self.is_ignored(domain_data_key):
                    self.logger.debug('%s ignored', domain_data_key)
                    continue
                # iterate over the urls; the non-url key "domain_total_reqs" is also among them
                if domain_data_key not in self.non_url_fields:
                    # from this point on, domain_data_key is the current url
                    if not self.is_throttling_suitable(
                            domain_data_value.get('url_throttled_reqs',
                                                  list([0] * 24)),
                            domain_data_value['url_total_reqs']):
                        # skip by allowed throttling percentage
                        continue
                    correlation_value = self.get_correlation(
                        domain_data_value['url_total_reqs'], domain_data.domain_total_reqs)
                    durations = domain_url_durations.get(domain_data_key)
                    if durations is None:
                        self.logger.error('Unable to get durations for %s', str(domain_data_key))
                        continue
                    if (self.request_number_exceeded(
                            domain_data_value['url_slow_reqs']) and
                            self.correlation_conditions(correlation_value)):
                        average_duration_calculation = np.mean(durations)
                        sum_url_slow_reqs = sum(
                            domain_data_value['url_slow_reqs'])
                        domain_slow_reqs += sum_url_slow_reqs
                        urls_data.append(dict(
                            name=domain_data_key, reqs_num=sum_url_slow_reqs,
                            average_duration=int(average_duration_calculation),
                            correlation=float(f'{correlation_value:.2f}')))
            if urls_data:
                sorted_urls = self.report_sorting(
                    list_to_sort=urls_data, leave_top=self.urls_number,
                    key_for_sorting='reqs_num')
                report['domains'].append(dict(
                    name=domain_data.domain_name, slow_urls=len(sorted_urls),
                    slow_reqs=domain_slow_reqs,
                    total_reqs=sum(domain_data.domain_total_reqs), urls=sorted_urls))
        if report['domains']:
            report['domains'] = self.report_sorting(
                list_to_sort=report['domains'], leave_top=self.domains_number,
                key_for_sorting='slow_reqs')
        return report

    def list_handling_considering_time(self, url_slow_reqs: list) -> list:
        """
        Build a new list from 'url_slow_reqs' by summing consecutive chunks
        of 'time' elements each of the original list.
        """
        time = self.time or 24
        return [sum(url_slow_reqs[i:time + i]) for i in
                range(0, len(url_slow_reqs), time)]

    def compare_elements_with_request_number(self,
                                             url_slow_reqs_by_time: list) -> bool:
        """
        Check whether any element is greater than or equal to "request_number"
        """
        for i in url_slow_reqs_by_time:
            if i >= self.request_number:
                return True
        return False

    def get_correlation(self, url_total_reqs: list, domain_total_reqs: list):
        """
        Calculates the correlation coefficient using the "url_total_reqs" and
        the "domain_total_reqs" lists
        """
        if not self.correlation:
            return 0
        return np.amin(np.corrcoef(url_total_reqs, domain_total_reqs))

    @staticmethod
    def report_sorting(list_to_sort: list, leave_top: int,
                       key_for_sorting: str) -> list:
        """
        Sort "list_to_sort" by "key_for_sorting" in descending order and keep
        only the top "leave_top" entries: domains are sorted by "slow_reqs"
        (keeping "domains_number" of them) and each domain's urls by
        "reqs_num" (keeping "urls_number" of them).
        leave_top == 0 keeps the full list
        """
        list_to_sort.sort(key=lambda dict_: dict_[key_for_sorting],
                          reverse=True)
        if leave_top:
            return list_to_sort[:leave_top]
        else:
            return list_to_sort

    def rename_old_report(self):
        """
        Rename the old report file, appending the date read from its contents
        to the filename
        """
        old_report = self.current_report_file
        if isfile(old_report):
            with open(old_report) as json_data:
                try:
                    d = json.load(json_data)
                except json.JSONDecodeError:
                    date_from_report = 'unknown'
                else:
                    date_from_report = d.get('date', 'dd.mm.yyyy').replace('.', '_')
            new_report_name = f'report__{date_from_report}.json'
            new_report = self._report_file(new_report_name)
            os.rename(old_report, new_report)

    def add_json_report(self, report: dict):
        """
        Write the report as JSON to the current report file, renaming any
        previous report first
        """
        self.rename_old_report()
        with open(self.current_report_file, 'w', encoding='utf-8') as f:
            json.dump(report, f, ensure_ascii=False, indent=4)

    def get_json_report(self) -> dict:
        """
        Return the contents of the current report, or an empty report in case of error
        """
        _filtering_hook = None

        try:
            with open(self.current_report_file) as report:
                report_dict = json.load(report, object_hook=_filtering_hook)
        except (OSError, json.JSONDecodeError):
            report_dict = self._empty_report
        return report_dict

    def correlation_conditions(self, correlation_value: float) -> bool:
        """
        If the correlation flag is enabled, compare the calculated correlation
        coefficient with the "correlation_coefficient" value from the
        configuration: return True if the calculated value exceeds the
        configured one, otherwise False. If the correlation flag is disabled,
        return True as well, since in that case the correlation coefficient is
        not checked and is reported as zero in the final report.
        """
        if not self.correlation:
            return True
        return correlation_value > self.correlation_coefficient

    def request_number_exceeded(self, url_slow_reqs):
        """
        At least one element of the computed list (url_slow_reqs_by_time)
        must be greater than or equal to request_number
        """
        url_slow_reqs_by_time = self.list_handling_considering_time(
            url_slow_reqs)
        return self.compare_elements_with_request_number(url_slow_reqs_by_time)

    def is_throttling_suitable(self, url_throttled_reqs: list,
                               url_total_reqs: list) -> bool:
        """
        Check that the percentage of throttled requests for the URL does not
        exceed the allowed threshold
        """
        throttled_percent = (sum(url_throttled_reqs) / sum(
            url_total_reqs)) * 100
        self.logger.debug('Calculated throttled percent %s', throttled_percent)
        return throttled_percent <= self.external_tunables.get(
            'allowed_throttling_percentage', 0)


if __name__ == "__main__":
    sentry_init()
    logging.basicConfig(filename='decision_maker_standalone.log',
                        level=logging.INFO)
    dm = DecisionMaker()
    dm()
