// Project: https://github.com/isaacs/node-lru-cache
// Based initially on @types/lru-cache
// https://github.com/DefinitelyTyped/DefinitelyTyped
// used under the terms of the MIT License, shown below.
//
// DefinitelyTyped license:
// ------
// MIT License
//
// Copyright (c) Microsoft Corporation.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the "Software"),
// to deal in the Software without restriction, including without limitation
// the rights to use, copy, modify, merge, publish, distribute, sublicense,
// and/or sell copies of the Software, and to permit persons to whom the
// Software is furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
// IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
// CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
// TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
// SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE
// ------
//
// Changes by Isaac Z. Schlueter released under the terms found in the
// LICENSE file within this project.

//tslint:disable:member-access
declare class LRUCache<K, V> implements Iterable<[K, V]> {
  constructor(options: LRUCache.Options<K, V>)

  /**
   * Number of items in the cache.
   * Alias for `cache.size`
   *
   * @deprecated since 7.0 use `cache.size` instead
   */
  public readonly length: number

  public readonly max: number
  public readonly maxSize: number
  public readonly sizeCalculation:
    | LRUCache.SizeCalculator<K, V>
    | undefined
  public readonly dispose: LRUCache.Disposer<K, V>
  /**
   * @since 7.4.0
   */
  public readonly disposeAfter: LRUCache.Disposer<K, V> | null
  public readonly noDisposeOnSet: boolean
  public readonly ttl: number
  public readonly ttlResolution: number
  public readonly ttlAutopurge: boolean
  public readonly allowStale: boolean
  public readonly updateAgeOnGet: boolean
  /**
   * @since 7.11.0
   */
  public readonly noDeleteOnStaleGet: boolean
  /**
   * @since 7.6.0
   */
  public readonly fetchMethod: LRUCache.Fetcher<K, V> | null

  /**
   * The total number of items held in the cache at the current moment.
   */
  public readonly size: number

  /**
   * The total size of items in cache when using size tracking.
   */
  public readonly calculatedSize: number

  /**
   * Add a value to the cache.
   */
  public set(
    key: K,
    value: V,
    options?: LRUCache.SetOptions<K, V>
  ): this

  /**
   * Return a value from the cache.
   * Will update the recency of the cache entry found.
   * If the key is not found, `get()` will return `undefined`.
   * This can be confusing when setting values specifically to `undefined`,
   * as in `cache.set(key, undefined)`. Use `cache.has()` to determine
   * whether a key is present in the cache at all.
   */
  // tslint:disable-next-line:no-unnecessary-generics
  public get<T = V>(
    key: K,
    options?: LRUCache.GetOptions
  ): T | undefined
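
  // Example for illustration only, assuming string keys and number values:
  //
  //   const cache = new LRUCache<string, number>({ max: 100 })
  //   cache.set('a', 1)
  //   cache.get('a')        // 1
  //   cache.get('missing')  // undefined
  //   // set(key, undefined) also makes get() return undefined,
  //   // so use has() to tell a stored undefined apart from a miss:
  //   cache.has('a')        // true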

  /**
   * Like `get()` but doesn't update recency or delete stale items.
   * Returns `undefined` if the item is stale, unless `allowStale` is set
   * either on the cache or in the options object.
   */
  // tslint:disable-next-line:no-unnecessary-generics
  public peek<T = V>(
    key: K,
    options?: LRUCache.PeekOptions
  ): T | undefined

  /**
   * Check if a key is in the cache, without updating the recency of use.
   * Will return false if the item is stale, even though it is technically
   * in the cache.
   * Will not update item age unless `updateAgeOnHas` is set in the options
   * or constructor.
   */
  public has(key: K, options?: LRUCache.HasOptions): boolean

  /**
   * Deletes a key out of the cache.
   * Returns true if the key was deleted, false otherwise.
   */
  public delete(key: K): boolean

  /**
   * Clear the cache entirely, throwing away all values.
   */
  public clear(): void

  /**
   * Delete any stale entries. Returns true if anything was removed, false
   * otherwise.
   */
  public purgeStale(): boolean

  /**
   * Find a value for which the supplied fn method returns a truthy value,
   * similar to Array.find().  fn is called as fn(value, key, cache).
   */
  // tslint:disable-next-line:no-unnecessary-generics
  public find<T = V>(
    callbackFn: (
      value: V,
      key: K,
      cache: this
    ) => boolean | undefined | void,
    options?: LRUCache.GetOptions
  ): T

  /**
   * Call the supplied function on each item in the cache, in order from
   * most recently used to least recently used.  fn is called as
   * fn(value, key, cache).  Does not update age or recency of use.
   */
  public forEach<T = this>(
    callbackFn: (this: T, value: V, key: K, cache: this) => void,
    thisArg?: T
  ): void

  /**
   * The same as `cache.forEach(...)` but items are iterated over in reverse
   * order.  (ie, less recently used items are iterated over first.)
   */
  public rforEach<T = this>(
    callbackFn: (this: T, value: V, key: K, cache: this) => void,
    thisArg?: T
  ): void

  /**
   * Return a generator yielding the keys in the cache,
   * in order from most recently used to least recently used.
   */
  public keys(): Generator<K>

  /**
   * Inverse order version of `cache.keys()`
   * Return a generator yielding the keys in the cache,
   * in order from least recently used to most recently used.
   */
  public rkeys(): Generator<K>

  /**
   * Return a generator yielding the values in the cache,
   * in order from most recently used to least recently used.
   */
  public values(): Generator<V>

  /**
   * Inverse order version of `cache.values()`
   * Return a generator yielding the values in the cache,
   * in order from least recently used to most recently used.
   */
  public rvalues(): Generator<V>

  /**
   * Return a generator yielding `[key, value]` pairs,
   * in order from most recently used to least recently used.
   */
  public entries(): Generator<[K, V]>

  /**
   * Inverse order version of `cache.entries()`
   * Return a generator yielding `[key, value]` pairs,
   * in order from least recently used to most recently used.
   */
  public rentries(): Generator<[K, V]>

  /**
   * Iterating over the cache itself yields the same results as
   * `cache.entries()`
   */
  public [Symbol.iterator](): Iterator<[K, V]>
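
  // Example for illustration only: iterating the cache directly is the same
  // as iterating cache.entries(), from most to least recently used.
  //
  //   for (const [key, value] of cache) {
  //     console.log(key, value)
  //   }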

  /**
   * Return an array of [key, entry] objects which can be passed to
   * cache.load()
   */
  public dump(): Array<[K, LRUCache.Entry<V>]>

  /**
   * Reset the cache and load in the items in `entries` in the order listed.
   * Note that the shape of the resulting cache may be different if the
   * same options are not used in both caches.
   */
  public load(
    cacheEntries: ReadonlyArray<[K, LRUCache.Entry<V>]>
  ): void
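
  // Example for illustration only: dump() and load() can copy one cache into
  // another, assuming both are constructed with the same options.
  //
  //   const src = new LRUCache<string, number>({ max: 100 })
  //   src.set('a', 1)
  //   const dst = new LRUCache<string, number>({ max: 100 })
  //   dst.load(src.dump())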

  /**
   * Evict the least recently used item, returning its value or `undefined`
   * if cache is empty.
   */
  public pop(): V | undefined

  /**
   * Deletes a key out of the cache.
   *
   * @deprecated since 7.0 use delete() instead
   */
  public del(key: K): boolean

  /**
   * Clear the cache entirely, throwing away all values.
   *
   * @deprecated since 7.0 use clear() instead
   */
  public reset(): void

  /**
   * Manually iterates over the entire cache proactively pruning old entries.
   *
   * @deprecated since 7.0 use purgeStale() instead
   */
  public prune(): boolean

  /**
   * @since 7.6.0
   */
  // tslint:disable-next-line:no-unnecessary-generics
  public fetch<ExpectedValue = V>(
    key: K,
    options?: LRUCache.FetchOptions<K, V>
  ): Promise<ExpectedValue | undefined>
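
  // Example for illustration only, assuming the cache was constructed with a
  // `fetchMethod`:
  //
  //   const value = await cache.fetch('key')
  //   // force a call to fetchMethod even if a fresh value is already cached:
  //   const fresh = await cache.fetch('key', { forceRefresh: true })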

  /**
   * @since 7.6.0
   */
  public getRemainingTTL(key: K): number
}

declare namespace LRUCache {
  type LRUMilliseconds = number
  type DisposeReason = 'evict' | 'set' | 'delete'

  type SizeCalculator<K, V> = (value: V, key: K) => number
  type Disposer<K, V> = (
    value: V,
    key: K,
    reason: DisposeReason
  ) => void
  type Fetcher<K, V> = (
    key: K,
    staleValue: V,
    options: FetcherOptions<K, V>
  ) => Promise<V | void | undefined> | V | void | undefined

  interface DeprecatedOptions<K, V> {
    /**
     * alias for ttl
     *
     * @deprecated since 7.0 use options.ttl instead
     */
    maxAge?: number

    /**
     * alias for sizeCalculation
     *
     * @deprecated since 7.0 use options.sizeCalculation instead
     */
    length?: SizeCalculator<K, V>

    /**
     * alias for allowStale
     *
     * @deprecated since 7.0 use options.allowStale instead
     */
    stale?: boolean
  }

  interface LimitedByCount {
    /**
     * The number of most recently used items to keep.
     * Note that we may store fewer items than this if maxSize is hit.
     */
    max: number
  }

  interface LimitedBySize<K, V> {
    /**
     * If you wish to track item size, you must provide a maxSize.
     * Note that we still will only keep up to max *actual items*,
     * if max is set, so size tracking may cause fewer than max items
     * to be stored.  At the extreme, a single item of maxSize size
     * will cause everything else in the cache to be dropped when it
     * is added.  Use with caution!
     * Note also that size tracking can negatively impact performance,
     * though for most cases, only minimally.
     */
    maxSize: number

    /**
     * Function to calculate size of items.  Useful if storing strings or
     * buffers or other items where memory size depends on the object itself.
     * Also note that oversized items do NOT immediately get dropped from
     * the cache, though they will cause faster turnover in the storage.
     */
    sizeCalculation?: SizeCalculator<K, V>
  }
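
  // Example for illustration only: a size-bounded cache of strings, using
  // character length as the (arbitrary) size metric.
  //
  //   const c = new LRUCache<string, string>({
  //     max: 500,
  //     maxSize: 5000,
  //     sizeCalculation: (value, key) => value.length,
  //   })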

  interface LimitedByTTL {
    /**
     * Max time in milliseconds for items to live in cache before they are
     * considered stale.  Note that stale items are NOT preemptively removed
     * by default, and MAY live in the cache, contributing to its LRU max,
     * long after they have expired.
     *
     * Also, as this cache is optimized for LRU/MRU operations, some of
     * the staleness/TTL checks will reduce performance, as they will incur
     * overhead by deleting items.
     *
     * Must be an integer number of ms, defaults to 0, which means "no TTL"
     */
    ttl: number

    /**
     * Boolean flag to tell the cache to not update the TTL when
     * setting a new value for an existing key (ie, when updating a value
     * rather than inserting a new value).  Note that the TTL value is
     * _always_ set (if provided) when adding a new entry into the cache.
     *
     * @default false
     * @since 7.4.0
     */
    noUpdateTTL?: boolean

    /**
     * Minimum amount of time in ms in which to check for staleness.
     * Defaults to 1, which means that the current time is checked
     * at most once per millisecond.
     *
     * Set to 0 to check the current time every time staleness is tested.
     * (This reduces performance, and is theoretically unnecessary.)
     *
     * Setting this to a higher value will improve performance somewhat
     * while using ttl tracking, albeit at the expense of keeping stale
     * items around a bit longer than intended.
     *
     * @default 1
     * @since 7.1.0
     */
    ttlResolution?: number

    /**
     * Preemptively remove stale items from the cache.
     * Note that this may significantly degrade performance,
     * especially if the cache is storing a large number of items.
     * It is almost always best to just leave the stale items in
     * the cache, and let them fall out as new items are added.
     *
     * Note that this means that allowStale is a bit pointless,
     * as stale items will be deleted almost as soon as they expire.
     *
     * Use with caution!
     *
     * @default false
     * @since 7.1.0
     */
    ttlAutopurge?: boolean

    /**
     * Return stale items from cache.get() before disposing of them.
     * Return stale values from cache.fetch() while performing a call
     * to the `fetchMethod` in the background.
     *
     * @default false
     */
    allowStale?: boolean

    /**
     * Update the age of items on cache.get(), renewing their TTL
     *
     * @default false
     */
    updateAgeOnGet?: boolean

    /**
     * Do not delete stale items when they are retrieved with cache.get()
     * Note that the get() return value will still be `undefined` unless
     * allowStale is true.
     *
     * @default false
     * @since 7.11.0
     */
    noDeleteOnStaleGet?: boolean

    /**
     * Update the age of items on cache.has(), renewing their TTL
     *
     * @default false
     */
    updateAgeOnHas?: boolean
  }
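
  // Example for illustration only: a TTL-bounded cache; the ttl chosen here
  // is arbitrary.
  //
  //   const c = new LRUCache<string, object>({
  //     max: 100,
  //     ttl: 1000 * 60 * 5,  // 5 minutes
  //     allowStale: true,    // serve a stale value rather than nothing
  //     updateAgeOnGet: false,
  //   })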

  type SafetyBounds<K, V> =
    | LimitedByCount
    | LimitedBySize<K, V>
    | LimitedByTTL

  // options shared by all three of the limiting scenarios
  interface SharedOptions<K, V> {
    /**
     * Function that is called on items when they are dropped from the cache.
     * This can be handy if you want to close file descriptors or do other
     * cleanup tasks when items are no longer accessible. Called as
     * `dispose(value, key, reason)`.  It's called before actually removing
     * the item from the internal cache, so it is *NOT* safe to re-add it.
     * Use `disposeAfter` if you wish to dispose of items after they have
     * been fully removed, when it is safe to add them back to the cache.
     */
    dispose?: Disposer<K, V>

    /**
     * The same as dispose, but called *after* the entry is completely
     * removed and the cache is once again in a clean state.  It is safe to
     * add an item right back into the cache at this point.
     * However, note that it is *very* easy to inadvertently create infinite
     * recursion this way.
     *
     * @since 7.3.0
     */
    disposeAfter?: Disposer<K, V>
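
    // Example for illustration only, assuming a hypothetical Handle type
    // whose close() method should run whenever an entry is dropped:
    //
    //   const c = new LRUCache<string, Handle>({
    //     max: 10,
    //     dispose: (value, key, reason) => value.close(),
    //   })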

    /**
     * Set to true to suppress calling the dispose() function if the entry
     * key is still accessible within the cache.
     * This may be overridden by passing an options object to cache.set().
     *
     * @default false
     */
    noDisposeOnSet?: boolean

    /**
     * `fetchMethod` Function that is used to make background asynchronous
     * fetches.  Called as `fetchMethod(key, staleValue, { signal, options,
     * context })`.  May return a value or a Promise for that value.
     *
     * If `fetchMethod` is not provided, then `cache.fetch(key)` is
     * equivalent to `Promise.resolve(cache.get(key))`.
     *
     * @since 7.6.0
     */
    fetchMethod?: LRUCache.Fetcher<K, V>
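
    // Example for illustration only, assuming a global fetch() is available
    // (e.g. Node 18+); the URL is hypothetical:
    //
    //   const c = new LRUCache<string, string>({
    //     max: 100,
    //     fetchMethod: async (key, staleValue, { signal }) => {
    //       const res = await fetch(`https://example.com/${key}`, { signal })
    //       return res.text()
    //     },
    //   })
    //   const body = await c.fetch('some-key')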

    /**
     * Set to true to suppress the deletion of stale data when a
     * `fetchMethod` throws an error or returns a rejected promise
     *
     * @default false
     * @since 7.10.0
     */
    noDeleteOnFetchRejection?: boolean

    /**
     * Set to any value in the constructor or fetch() options to
     * pass arbitrary data to the `fetchMethod` in the `context` field
     * of its options argument.
     *
     * @since 7.12.0
     */
    fetchContext?: any
  }

  type Options<K, V> = SharedOptions<K, V> &
    DeprecatedOptions<K, V> &
    SafetyBounds<K, V>

  /**
   * options which override the options set in the LRUCache constructor
   * when making `cache.set()` calls.
   */
  interface SetOptions<K, V> {
    /**
     * A value for the size of the entry, prevents calls to
     * `sizeCalculation` function.
     */
    size?: number
    sizeCalculation?: SizeCalculator<K, V>
    ttl?: number
    start?: number
    noDisposeOnSet?: boolean
    noUpdateTTL?: boolean
  }
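
  // Example for illustration only; the per-call override values are arbitrary:
  //
  //   cache.set('k', 1, { ttl: 1000, noUpdateTTL: false })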

  /**
   * options which override the options set in the LRUCache constructor
   * when making `cache.has()` calls.
   */
  interface HasOptions {
    updateAgeOnHas?: boolean
  }

  /**
   * options which override the options set in the LRUCache constructor
   * when making `cache.get()` calls.
   */
  interface GetOptions {
    allowStale?: boolean
    updateAgeOnGet?: boolean
    noDeleteOnStaleGet?: boolean
  }

  /**
   * options which override the options set in the LRUCache constructor
   * when making `cache.peek()` calls.
   */
  interface PeekOptions {
    allowStale?: boolean
  }

  interface FetcherFetchOptions<K, V> {
    allowStale?: boolean
    updateAgeOnGet?: boolean
    noDeleteOnStaleGet?: boolean
    size?: number
    sizeCalculation?: SizeCalculator<K, V>
    ttl?: number
    noDisposeOnSet?: boolean
    noUpdateTTL?: boolean
    noDeleteOnFetchRejection?: boolean
  }

  /**
   * options which override the options set in the LRUCache constructor
   * when making `cache.fetch()` calls.
   * This is the union of GetOptions and SetOptions, plus
   * `noDeleteOnFetchRejection`, `forceRefresh`, and `fetchContext`.
   */
  interface FetchOptions<K, V> extends FetcherFetchOptions<K, V> {
    forceRefresh?: boolean
    fetchContext?: any
  }

  interface FetcherOptions<K, V> {
    signal: AbortSignal
    options: FetcherFetchOptions<K, V>
    context: any
  }

  interface Entry<V> {
    value: V
    ttl?: number
    size?: number
    start?: number
  }
}

export = LRUCache
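
// Example for illustration only: because this module uses `export =`, the
// import-require form is needed unless esModuleInterop is enabled.
//
//   import LRUCache = require('lru-cache')
//   // or, with esModuleInterop:
//   // import LRUCache from 'lru-cache'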
