diff --git "a/https:/huggingface.co/datasets/iamgroot42/mimir/tree/main/test/github_ngram_7_0.2.jsonl" "b/https:/huggingface.co/datasets/iamgroot42/mimir/tree/main/test/github_ngram_7_0.2.jsonl" new file mode 100644--- /dev/null +++ "b/https:/huggingface.co/datasets/iamgroot42/mimir/tree/main/test/github_ngram_7_0.2.jsonl" @@ -0,0 +1,268 @@ +"{ lib\n, python3Packages\n, fetchFromBitbucket\n, fetchpatch\n}:\n\npython3Packages.buildPythonApplication rec {\n pname = \"flatcam\";\n version = \"8.5\";\n\n src = fetchFromBitbucket {\n owner = \"jpcgt\";\n repo = \"${pname}\";\n rev = \"533afd6a1772857cb633c011b5e0a15b60b1e92e\"; # 8.5 with Red Hat packaging.\n sha256 = \"199kiiml18k34z1zhk2hbhibphmnv0kb11kxiajq52alps0mjb3m\";\n };\n\n propagatedBuildInputs = with python3Packages; [\n matplotlib\n numpy\n pyqt4\n Rtree\n scipy\n setuptools\n shapely\n simplejson\n six\n svg-path\n ];\n\n packaging_fix_pull_request_patch = fetchpatch {\n name = \"packaging_fix_pull_request.patch\";\n url = \"https://bitbucket.org/trepetti/flatcam/commits/5591ed889d1f48a5190fe237b562cb932cb5876c/raw\";\n sha256 = \"19rhjdrf1n1q29cgpcry6pl2kl90zq0d613hhkwdir9bhq5bkknp\";\n };\n\n patches = [\n packaging_fix_pull_request_patch\n ./release.patch\n ];\n\n # Only non-GUI tests can be run deterministically in the Nix build environment.\n checkPhase = ''\n python -m unittest tests.test_excellon\n python -m unittest tests.test_gerber_buffer\n python -m unittest tests.test_paint\n python -m unittest tests.test_pathconnect\n '';\n\n meta = with lib; {\n description = \"2-D post processing for PCB fabrication on CNC routers.\";\n homepage = \"https://bitbucket.org/jpcgt/flatcam\";\n license = licenses.mit;\n maintainers = with maintainers; [ trepetti ];\n };\n}" +"## Flapjack Changelog\n\n# 2.0.0 - 2016-02-08\n- No changes\n\n# 2.0.0rc1 - 2016-01-08\n- Feature: First release candidate for v2.0.\n- Feature: Added configurable recovery delay (@necula-01 and @ali-graham)\n\n# 2.0.0b1 - 2015-10-12\n- 
Feature: First beta release for v2.0.\n\n# 1.6.0rc3 - 2015-06-15\n- Chore/Bug?: SNS gateway improvements (@ali-graham)\n- Feature: add summary to rollup alerts #869 (@necula-01)\n- Bug: Fix notification rule regex matches #871 (@necula-01)\n\n# 1.6.0rc2 - 2015-05-14\n- Bug: Fix SNS query encoding and subject length #867 (@ali-graham, @xcxsxvx)\n- Chore: improve debug logging in pagerduty when looking for acks e44abd5 (@jessereynolds)\n\n# 1.6.0rc1 - 2015-05-13\n- Feature: use token authentication for pagerduty gateway #831 (@alperkokmen)\n- Feature: expose failure delays in web UI #849 (@jessereynolds)\n- Bug: performance improvement - fix usage of KEYS command for entity check names c57d3a5 (@ali-graham)\n- Bug: remove disabled checks from rollup calculations #843 (@ali-graham)\n- Bug: fall back to basic auth for pagerduty incidents api #853 (@jessereynolds)\n- Chore: pagerduty ack retrieval improvements #858 (@jessereynolds)\n\n# 1.5.0 - 2015-03-31\n- No changes\n\n# 1.5.0rc1 - 2015-03-30\n- Feature: Expose failure delay (#748) in JSONAPI check report status #827 (@Hobbsee)\n- Feature: Split up internal metrics api" +"\"\"\"\n truncated(d, l, u):\n\nTruncate a distribution between `l` and `u`.\nBuilds the most appropriate distribution for the type of `d`,\nthe fallback is constructing a `Truncated` wrapper.\n\nTo implement a specialized truncated form for a distribution `D`,\nthe method `truncate(d::D, l::T, u::T) where {T <: Real}`\nshould be implemented.\n\n# Arguments\n- `d::UnivariateDistribution`: The original distribution.\n- `l::Real`: The lower bound of the truncation, which can be a finite value or `-Inf`.\n- `u::Real`: The upper bound of the truncation, which can be a finite value of `Inf`.\n\nThrows an error if `l >= u`.\n\"\"\"\nfunction truncated(d::UnivariateDistribution, l::Real, u::Real)\n return truncated(d, promote(l, u)...)\nend\n\nfunction truncated(d::UnivariateDistribution, l::T, u::T) where {T <: Real}\n l < u || error(\"lower bound should 
be less than upper bound.\")\n T2 = promote_type(T, eltype(d))\n lcdf = isinf(l) ? zero(T2) : T2(cdf(d, l))\n ucdf = isinf(u) ? one(T2) : T2(cdf(d, u))\n tp = ucdf - lcdf\n Truncated(d, promote(l, u, lcdf, ucdf, tp, log(tp))...)\nend\n\ntruncated(d::UnivariateDistribution, l::Integer, u::Integer) = truncated(d, float(l), float(u))\n\n\"\"\"\n Truncated(d, l, u):\n\nCreate a generic wrapper for a truncated distribution.\nPrefer calling the function `truncated(d, l, u)`, which can choose the appropriate\nrepresentation of the truncated distribution.\n\n# Arguments\n- `d::UnivariateDistribution`: The" +"/*\n Title: Array\n \n A light weight implementation of dynamic array structure in C Macros.\n \n Each array actually consists of the following members\n \n (start code)\n ElementType* Name; //The pointer to elements.\n int Name_Index; //Index of the top element.\n int Name_Size; //Size allocated for the Array.\n (end)\n \n \n Constant: Array_Addition\n \n The number of empty elements to be added to an Array when an\n enlargement takes place.\n \n \n Section: Constructing/Destructing\n\n Macro: Array_Define\n \n Defines an Array.\n \n Usage:\n \n > Array_Define(Type, Name);\n \n This is equivalent to\n \n (start code)\n Type* Name;\n int Name_Index;\n int Name_Size;\n (end)\n \n Parameters:\n \n Type - Type of elements(e.g. 
int, float ...)\n \n Name - Name of the Array.\n \n \n Macro: Array_Gtor\n \n Defines and inits an Array.\n \n Usage:\n \n > Array_Gtor(Type, Name);\n \n Parameters:\n \n Type - Type of elements.\n \n Name - Name of the Array.\n \n \n Macro: Array_Ctor\n \n Inits an Array.\n \n Usage:\n \n > Array_Gtor(Type, Name);\n \n Parameters:\n \n Type - Type of elements.\n \n Name - Name of the Array.\n \n \n Macro: Array_Dtor\n \n Destructs(frees) an Array.\n \n Usage:\n \n > Array_Dtor(Type, Name);\n \n Parameters:\n \n Type - Type of elements.\n \n Name - Name of the Array.\n \n \n Macro: Array_ObjDtor\n \n Destructs an Array and automatically calls the destructor for each of \n its elements.\n \n Usage:\n \n > Array_ObjDtor(Type, Name);\n \n Parameters:\n \n Type - Type of elements.\n \n Name - Name of the Array.\n \n \n Section: Operations" +"(** \n Functions to optimize vine expressions.\n \n Basically, constant_fold will only perform constant folding, whereas\n simplify() will also perform alpha substitution.\n \n Original author: Ivan Jager\n *)\n\nopen ExtList\nopen Vine\nopen Vine_util\n\nmodule D = Debug.Make(struct let name = \"Vine_opt\" and default=`NoDebug end)\nopen D\n\nmodule VH = Vine.VarHash\n\n\ntype alias = MayAlias | DoesAlias | PartialAlias | NoAlias\n\n\n(* some helper functions *)\n\n(* These fix* functions are duplicated from\n Execution.exec_utils. That's not ideal, but I'm not sure of a better\n place. 
*)\nlet fix_u1 x = Int64.logand x 0x1L\nlet fix_u8 x = Int64.logand x 0xffL\nlet fix_u16 x = Int64.logand x 0xffffL\nlet fix_u32 x = Int64.logand x 0xffffffffL\n\nlet fix_s1 x = Int64.shift_right (Int64.shift_left x 63) 63\nlet fix_s8 x = Int64.shift_right (Int64.shift_left x 56) 56\nlet fix_s16 x = Int64.shift_right (Int64.shift_left x 48) 48\nlet fix_s32 x = Int64.shift_right (Int64.shift_left x 32) 32\n\n(* drop high bits *)\nlet to64 v =\n match v with\n\tInt(t,i) -> let bits = 64 - bits_of_width t in\n\t Int64.shift_right_logical (Int64.shift_left i bits) bits\n | _ -> raise (Invalid_argument \"to64 is only for integers\")\n\n(* sign extend to 64 bits*)\nlet tos64 v =\n match v with\n\tInt(t,i) -> let" +"# Extending the Pimcore User\n\nPimcore does not allow to extend the user directly. Instead it allows to create a relation \nbetween a user and one or more Pimcore objects. This can be used to add information to a user \nor to associate one or more objects directly with a system user. \n\nThis article presents an example where the `member` object is associated with a Pimcore user. \nThe screenshots show how this can be achieved through the Pimcore backend UI. In the bottom \nthere is also an example of how the user and member objects can be created and associated \nwith each other programmatically.\n\nRegardless of the creation method of users and objects, in the first step the member class \nhas to be defined in *Settings* > *Object* > *Classes*:\n\n![Member Class Config](../img/object-user1.png)\n\nIn this example, the class `member` has the three properties `location`, `name` and `user`. \nThe class can have an arbitrary number of properties. What is important in this context is, \nthat it has a property of the type `User`. Speaking in code this would be a \n`\\Pimcore\\Model\\DataObject\\ClassDefinition\\Data\\User`.\n\nWhen creating the first object instance of the member class, you can see the input widget \nfor the user property." 
+"#\n# Turn off the television if nobody seems to be home.\n#\n# @subscribe group.presence\n# @subscribe group.motion_sensors\n# @subscribe switch.lounge_tv\n#\n# @publish switch.lounge_tv\n#\n\n- id: tv_off\n alias: \"Television OFF\"\n\n trigger:\n # No known person is home.\n - platform: state\n entity_id: group.presence\n to: 'not_home'\n for:\n hours: 2\n\n # Motion is no longer detected.\n - platform: state\n entity_id: group.motion_sensors\n to: 'off'\n for:\n hours: 2\n\n condition:\n # If TV is on.\n - condition: state\n entity_id: switch.lounge_tv\n state: 'on'\n\n # No known person is home.\n - condition: state\n entity_id: group.presence\n state: 'not_home'\n for:\n hours: 2\n\n # No motion has been detected.\n - condition: state\n entity_id: group.motion_sensors\n state: 'off'\n for:\n hours: 2\n\n action:\n # Turn off the TV.\n - service: homeassistant.turn_off\n data:\n entity_id:\n - switch.lounge_tv" +"// WIP is a not-for-profit application. All revenue from the \"pro\" plan is\n// donated, see \"About WIP\" in the README.md file.\n//\n// If you cannot pay for the \"pro\" plan but need the additional features, add\n// your user or organization account with a comment explanation.\n\nmodule.exports = [\n 'resistbot', // 2018-10-23: Volunteer run org, helping people engage in their democracy\n 'urbanengine', // 2019-03-29: non-profit organization helping startups & small businesses network and grow within their community\n 'moonsmile', // 2019-9-1: Student writing tools to learn, I promise i will pay when I finish my project.\n 'apache', // 2020-01-01: non-profit corporation to support Apache software projects\n 'acts-project' // 2020-03-30: Generic track reconstruction framework for high energy physics\n]" +"HTML Agility Pack V1.3.0.0\n\nThis directory contains the HtmlAgilityPack solution and source code and this readme file. 
The other directories are:\n\n+ Doc\t\t\tContains the HtmlAgilityPack assembly documentation as a .CHM file\n+ Samples\t\tContains some HtmlAgilityPack samples\n + GetDocLinks\t\tA sample that determines all linked files and references of a givn HTML document\n + Html2Txt\t\tA sample that converts an HTML document into a plain text .txt file\n + Html2Xml\t\tA sample that converts an HTML document into an XML file\n + HtmlToRss\t\tA sample that converts an HTML file into an RSS feed. Two methods are demonstrated.\n\nSimon Mourier\nJune 2003" +"caption: Zeilenumbruch\ncreated: 20131214165710101\ncreator: pmario\nmodified: 20140922133054642\nmodifier: ChrisK\ntags: WikiText\ntitle: Zeilenumbruch in WikiText\ntype: text/vnd.tiddlywiki\n\n!! Block Syntax\nDie \u00fcbliche Behandlung von [[Abs\u00e4tzen in WikiText|Absatz in WikiText]] ignoriert einzelne Zeilenumbr\u00fcche. Zwei Zeilenumbr\u00fcche werden als neuer Absatz interpretiert. \n\nUm trotzdem Texte mit \"hartem\" Zeilenumbruch darstellen zu k\u00f6nnen, zum Beispiel: \"Gedichte\", kann folgende Formatierung verwendet werden:\n\n<>\n\n!! HTML Syntax\n\nIn normalem Flie\u00dftext sollten keine zus\u00e4tzlichen \"harten\" Zeilenumbr\u00fcche eingef\u00fcgt werden, da diese bei unterschiedlichen Anzeigen zB: Handy's falsch dargestellt werden. \n\nIn Ausnahmef\u00e4llen sollte es jedoch m\u00f6glich sein, einen \"harten\" Zeilenumbruch zu setzen. Daf\u00fcr kann der HTML `
` \"tag\" verwendet werden. \n\n```\nErste Zeile
\nZweite Zeile\n```\n\nDargestellt als: \n\nErste Zeile
\nZweite Zeile\n\nHTML Code: \n\n```\n

Erste Zeile
Zweite Zeile

\n```" +"function! youcompleteme#test#popup#CheckPopupPosition( winid, pos )\n redraw\n let actual_pos = popup_getpos( a:winid )\n let ret = 0\n if a:pos->empty()\n return assert_true( actual_pos->empty(), 'popup pos empty' )\n endif\n for c in keys( a:pos )\n if !has_key( actual_pos, c )\n let ret += 1\n call assert_report( 'popup with ID '\n \\ . string( a:winid )\n \\ . ' has no '\n \\ . c\n \\ . ' in: '\n \\ . string( actual_pos ) )\n else\n let ret += assert_equal( a:pos[ c ],\n \\ actual_pos[ c ],\n \\ c . ' in: ' . string( actual_pos ) )\n endif\n endfor\n return ret\nendfunction\n\n\nfunction! youcompleteme#test#popup#ScreenPos( winid, row, col )\n \" Returns the screen position of the row/col in win with id winid. This\n \" differs from screenpos() only in that the position need not be valid, that\n \" is there need not be a text character in the referenced cell. This is useful\n \" when finding where a popup _should_ be in screen position relative to actual\n \" text position\n \"\n \" It also probably doesn't work properly for multi-byte characters and tabs\n \" and things. And only returns the 'row' and 'col' items of the dict.\n \"\n \" So it's not that" +"\nTo build a new release, a few steps are required. This will be enhanced in the future.\n\n1. Update FireCamp software release version\n1) Change the version in Makefile from \"latest\" to the release version such as \"1.0\".\n2) Chagne the version in common/types.go from \"latest\" to the release version such as \"1.0\".\n3) Update the \"Release\" to such as \"1.0\" and \"QSS3KeyPrefix\" to \"firecamp/releases/1.0\" in firecamp-master.template and firecamp.template\n\nUpdate the README.md and docs/installation/README.md to the new version as well.\n\nCheck whether need to update AMI ID in firecamp-autoscalegroup.template, https://aws.amazon.com/amazon-linux-ami/\n\n2. Upload the new files\nCreate the new release folder in the \"cloudstax\" bucket, such as firecamp/releases/1.0. 
Create subfolders: \"templates\", \"scripts\", \"packages\". Upload all templates under packages/aws-cloudformation/ to \"templates\", upload packages/aws-cloudformation/init.sh to \"scripts\", and upload $GOPATH/bin/firecamp-service-cli.tgz and $GOPATH/bin/firecamp-swarminit.tgz to \"packages\". The swarminit command is used by packaging/aws-cloudformation/init.sh to initialize the Docker Swarm cluster.\n\nNote: the default template allows '.' in the QSS3KeyPrefix. AWS QuickStart always points to the latest version, such as aws/vpc/latest for QSAWSVPCS3KeyPrefix, and '.' is not allowed in the key prefix. For the new release, when updating the AWS QuickStart git, remove '.' in the QSS3KeyPrefix and point to the latest version in QuickStart.\nAlso change \"QSS3BucketName\"" +"---\ntitle: Difference between Git and GitHub\nlocaleTitle: Diferencia entre Git y GitHub\n---\n## Diferencia entre Git y GitHub\n\nGit y Github son dos cosas diferentes. [Git](https://git-scm.com/) es el [sistema de control de versiones](https://en.wikipedia.org/wiki/Version_control) , mientras que [GitHub](https://github.com/) es un servicio para alojar repositorios de Git y ayudar a las personas a colaborar en la escritura de software. Sin embargo, a menudo se confunden por su nombre similar, debido al hecho de que GitHub se construye sobre Git, y porque muchos sitios web y art\u00edculos no hacen la diferencia entre ellos lo suficientemente clara.\n\n![Git no es GitHub](https://i.imgur.com/EkjwJdr.png)\n\n### Git\n\nGit es el sistema de control de versiones distribuido. Git es responsable de realizar un seguimiento de los cambios en el contenido, generalmente los archivos de c\u00f3digo fuente.\n\nPara m\u00e1s informaci\u00f3n, hay un [art\u00edculo completo sobre el propio Git](https://guide.freecodecamp.org/git) .\n\n### GitHub\n\nGitHub es una empresa que proporciona hosting de repositorio Git. 
Eso significa que proporcionan una soluci\u00f3n llave en mano para alojar repositorios Git en sus servidores. Eso puede ser \u00fatil para mantener una copia de seguridad de su repositorio (Git solo rastrea los cambios realizados en sus archivos a lo largo del tiempo, todav\u00eda se debe hacer" +"--- cvs-1.11.21/src/diff.c.old\t2005-05-27 19:17:03.000000000 +0200\n+++ cvs-1.11.21/src/diff.c\t2005-12-15 15:22:05.000000000 +0100\n@@ -955,14 +955,16 @@\n \t /* The first revision does not exist. If EMPTY_FILES is\n true, treat this as an added file. Otherwise, warn\n about the missing tag. */\n-\t if( use_rev2 == NULL || RCS_isdead( vers->srcfile, use_rev2 ) )\n+\t if( use_rev2 == NULL || RCS_isdead( vers->srcfile, use_rev2 ) ) {\n \t\t/* At least in the case where DIFF_REV1 and DIFF_REV2\n \t\t * are both numeric (and non-existant (NULL), as opposed to\n \t\t * dead?), we should be returning some kind of error (see\n \t\t * basicb-8a0 in testsuite). The symbolic case may be more\n \t\t * complicated.\n \t\t */\n-\t\treturn DIFF_SAME;\n+\t\terror (0, 0, \"no revision in file %s or missing file %s\", finfo->fullname, finfo->fullname);\n+\t\treturn DIFF_ERROR;\n+\t }\n \t if( empty_files )\n \t\treturn DIFF_ADDED;\n \t if( use_rev1 != NULL )" +"require 'open-uri'\n# Worker that takes care of comparing the snapshot image against the baseline.\n# This worker is responsible for updating the snapshot instance with the result\n# of the snapshot comparison.\nclass SnapshotComparerWorker < SnapshotWorker\n def perform(snapshot_id)\n return unless set_snapshot(snapshot_id)\n return unless @snapshot.compare?\n\n url = @snapshot.url\n viewport = @snapshot.viewport\n compare_with = @snapshot.compared_with || url.baseline(viewport)\n\n Rails.logger.info \"Comparing snapshot of #{url} @ #{viewport} \" +\n 'against baseline'\n comparison = Diffux::SnapshotComparer.new(to_chunky_png(compare_with),\n to_chunky_png(@snapshot)).compare!\n diff = 
@snapshot.build_snapshot_diff(comparison.slice(:diff_in_percent))\n diff.before_snapshot = compare_with\n diff_image = comparison[:diff_image]\n if diff_image\n diff.image_height = diff_image.height\n diff.image_width = diff_image.width\n FileUtil.with_tempfile do |tempfile|\n diff_image.save(tempfile)\n diff.image = File.open(tempfile)\n end\n end\n @snapshot.accept if diff.diff_in_percent == 0\n\n @snapshot.transaction do\n diff.save!\n comparison[:diff_clusters].each do |cluster|\n diff.snapshot_diff_clusters.create!(cluster)\n end\n @snapshot.compared_with = compare_with\n @snapshot.save!\n end\n end\n\n private\n\n # @param snapshot [Snapshot]\n # @return [ChunkyPNG::Image]\n def to_chunky_png(snapshot)\n case snapshot.image.options[:storage]\n when :s3\n ChunkyPNG::Image.from_io(open(snapshot.image.url))\n when :filesystem\n ChunkyPNG::Image.from_file(snapshot.image.path)\n end\n end\nend" +"(* Title: Pure/Concurrent/thread_attributes.ML\n Author: Makarius\n\nThread attributes for interrupt handling.\n*)\n\nsignature THREAD_ATTRIBUTES =\nsig\n type attributes\n val get_attributes: unit -> attributes\n val set_attributes: attributes -> unit\n val convert_attributes: attributes -> Thread.Thread.threadAttribute list\n val no_interrupts: attributes\n val test_interrupts: attributes\n val public_interrupts: attributes\n val private_interrupts: attributes\n val sync_interrupts: attributes -> attributes\n val safe_interrupts: attributes -> attributes\n val with_attributes: attributes -> (attributes -> 'a) -> 'a\n val uninterruptible: ((('c -> 'd) -> 'c -> 'd) -> 'a -> 'b) -> 'a -> 'b\n val expose_interrupt: unit -> unit (*exception Interrupt*)\nend;\n\nstructure Thread_Attributes: THREAD_ATTRIBUTES =\nstruct\n\nabstype attributes = Attributes of Word.word\nwith\n\n(* thread attributes *)\n\nval thread_attributes = 0w7;\n\nval broadcast = 0w1;\n\nval interrupt_state = 0w6;\nval interrupt_defer = 0w0;\nval interrupt_synch = 0w2;\nval interrupt_asynch = 0w4;\nval 
interrupt_asynch_once = 0w6;\n\n\n(* access thread flags *)\n\nval thread_flags = 0w1;\n\nfun load_word () : word =\n RunCall.loadWord (RunCall.unsafeCast (Thread.Thread.self ()), thread_flags);\n\nfun get_attributes () = Attributes (Word.andb (load_word (), thread_attributes));\n\nfun set_attributes (Attributes w) =\n let\n val w' = Word.orb (Word.andb (load_word (), Word.notb thread_attributes), w);\n val _ = RunCall.storeWord (RunCall.unsafeCast (Thread.Thread.self ()), thread_flags, w');\n in\n if Word.andb (w', interrupt_asynch) = interrupt_asynch\n then Thread.Thread.testInterrupt () else ()\n end;" +"; SPDX-License-Identifier: BSD-2-Clause\n; X-SPDX-Copyright-Text: (c) Solarflare Communications Inc\n; Config files support comments such as this line\n\n; Cluster names are case sensitive and are defined as following.\n[Cluster A]\n\n; Property names are case insensitive. But property values are case\n; sensitive.\n\n; CaptureInterface is a required property and be specified just once\nCaptureInterface = eth4\n\n; CaptureStream is a required property can be specified multiple\n; times to capture multiple streams.\nCaptureStream = dhost=239.1.2.3, dport=12345, udp\nCaptureStream = dhost=239.1.2.3, dport=12346, udp\n\n; NumChannels is optional. If not specified, default is 1\nNumChannels = 2\n\n; ProtectionMode is optional. If not specified, default is\n; EF_PD_DEFAULT. Allowed options can be looked up in\n; onload/src/include/etherfabric/pd.h\nProtectionMode = EF_PD_DEFAULT\n\n; We also support multi line properties like below\nCaptureStream = \\\n dhost=239.1.2.3,dport=12347,udp\n\n; CaptureMode is optional. The default is 'steal'. You can also set\n; it to 'sniff'. Currently, only 'all' CaptureStream can be sniffed.\nCaptureMode = steal\n\n; Promiscuous is optional. The default is 1. It does not have any\n; affect if CaptureMode is not specified. 
If CaptureMode is\n; specified, then this dictates whether promiscuous mode is enabled or\n; not.\nPromiscuous = 1\n\n\n; You" +"import SwiftUI\nimport Combine\n\npublic final class Store: ObservableObject where StateType: StateMachine {\n \n private let initialState: StateType\n private var subsequentStates: [StateType] = []\n\n public let objectWillChange = PassthroughSubject()\n \n public init(state: StateType) {\n initialState = state\n }\n \n var allStates: [StateType] {\n [[initialState], subsequentStates].flatMap({ $0 })\n }\n \n var stateCount: Int {\n 1 + subsequentStates.count\n }\n \n var currentStateIndex: Int = 0 {\n didSet {\n withAnimation {\n objectWillChange.send(())\n }\n }\n }\n \n /// The current state of the store. This will update as time traveling occurs.\n public var state: StateType {\n allStates[currentStateIndex]\n }\n \n /// Dispatches an event to be applied to the current state.\n public func dispatch(event: StateType.Event) {\n var newState = state\n newState.update(with: event)\n subsequentStates.append(newState)\n currentStateIndex = stateCount - 1\n }\n \n}" +"*copy pasted from Qweko Dev Discord - check pins*\n\nClient Authorization Example (For using Client's methods)\n```py\nimport pypresence\nc = pypresence.Client(CLIENT_ID)\nc.start()\nauth = c.authorize(CLIENT_ID, ['rpc']) # If you need other scopes, add them\ncode_grant = auth.code\n# Exchange code grant for token, rtfd\n# Now each time you can use\nc.authenticate(oauth2token)\n```\nYou *will* need a redirect URI set in your application. localhost/randomendpoint should work, but you'll also have to listen for any requests sent to it. It'll return an oauth2 code, which you send to the Token URL (read the docs), in return for a Oauth2 token to use with `Client.authenticate(token)` that means they dont need to auth it every time." +"#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"Simulator tests.\n\nExample on how to load different things with the simulators. 
This example is still in an experimental phase. For\nnow, only Bullet is fully-supported. We are working on the other ones, especially the Mujoco simulator.\n- Bullet: OK\n- Raisim: OK (todo: for collision bodies, it only accepts OBJ files)\n- MuJoCo: OK (todo: control still missing)\n- DART: OK, but capsules don't have collision shapes... (todo: fix some URDFs)\n- VREP: Not implemented yet + problem when importing PyRep with pybullet. Also, need to figure out how to call the\n 'loadURDF' plugin.\n- Isaac: not available yet.\n\"\"\"\n\nimport os\nfrom itertools import count\n\nfrom pyrobolearn.simulators.bullet import Bullet\nfrom pyrobolearn.simulators.raisim import Raisim\nfrom pyrobolearn.simulators.dart import Dart\nfrom pyrobolearn.simulators.mujoco import Mujoco\n# from pyrobolearn.simulators.vrep import VREP # Problem when importing PyRep with Pybullet\n# from pyrobolearn.simulators.isaac import Isaac # Not available yet\n\n\nsim = Bullet(render=True)\n# sim = Raisim(render=True)\n# sim = Dart(render=True)\n# sim = Mujoco(render=True)\n# sim = VREP(render=True)\n# sim = Isaac(render=True)\nprint(\"Gravity: {}\".format(sim.get_gravity()))\n\n# load floor\nfloor = sim.load_floor(dimension=20)\n\n# create box\nbox = sim.create_primitive_object(sim.GEOM_BOX, position=(0, 0, 2), mass=1, rgba_color=(1, 0, 0, 1))\nsphere = sim.create_primitive_object(sim.GEOM_SPHERE, position=(2," +"INCLUDES += -I$(RIOTBASE)/pkg/lvgl/include\nINCLUDES += -I$(PKGDIRBASE)\n\n# Don't use relative includes in lvgl\nCFLAGS += -DLV_CONF_INCLUDE_SIMPLE\n\nifneq (,$(filter lvgl_contrib,$(USEMODULE)))\n DIRS += $(RIOTBASE)/pkg/lvgl/contrib\nendif\n\n# Configuration options\n# Graphical settings\nLVGL_COLOR_DEPTH ?= 16\nLVGL_COLOR_16_SWAP ?= 1\n\n# Memory settings\nLVGL_MEM_SIZE ?= 5U*1024U\n\n# Engine settings\nLVGL_INACTIVITY_PERIOD_MS ?= 5*MS_PER_SEC # 5s\nLVGL_TASK_HANDLER_DELAY_US ?= 5*US_PER_MS # 5ms\nLVGL_TASK_THREAD_PRIO ?= THREAD_PRIORITY_MAIN-1\n\n# Set the CFLAGS variable 
accordingly\nCFLAGS += -DLV_COLOR_DEPTH=$(LVGL_COLOR_DEPTH)\nCFLAGS += -DLV_COLOR_16_SWAP=$(LVGL_COLOR_16_SWAP)\nCFLAGS += -DLV_MEM_SIZE=$(LVGL_MEM_SIZE)\nCFLAGS += -DLVGL_INACTIVITY_PERIOD_MS=$(LVGL_INACTIVITY_PERIOD_MS)\nCFLAGS += -DLVGL_TASK_HANDLER_DELAY_US=$(LVGL_TASK_HANDLER_DELAY_US)\nCFLAGS += -DLVGL_TASK_THREAD_PRIO=$(LVGL_TASK_THREAD_PRIO)\n\n# lvgl module is not a concrete module, so declare it as a pseudomodule\nPSEUDOMODULES += lvgl\n\n# touch capabilities are available via a pseudomodule\nPSEUDOMODULES += lvgl_contrib_touch" +"---\ndescription: How to to tune the computation of forces and stresses\nauthors: FJ\n---\n\n\nThis page gives hints on how to to tune the computation of forces and stresses with the ABINIT package.\n\n## Introduction\n\nHellman-Feynman forces are computed from an analytical formula, and\ncorresponds exactly to the limit of finite differences of energy for\ninfinitesimally small atomic displacements when the ground-state calculation\nis at convergence. This feature is available for all the cases where the total\nenergy can be computed. A correction for non-converged cases allows to get\naccurate forces with less converged wavefunctions than without it. The\ndecomposition of the forces in their different components can be provided.\n\nStress can also be computed. This feature is available for all the cases where\nthe total energy can be computed (except wavelets). The decomposition of the\nstresses in their different components can be provided. A smearing scheme\napplied to the kinetic energy [[ecutsm]] allows one to get smooth energy\ncurves as a function of lattice parameters and angles. 
A target stress can be\ngiven by the user ([[strtarget]]), the geometry optimization algorithm will\ntry to find" +"StartChar: afii10040\nEncoding: 1062 1062 774\nWidth: 1060\nFlags: W\nHStem: 0 150<220 697> 1276 20G<61.9537 220 698.965 856>\nVStem: 59 161<150 1296> 697 158<158 1296> 855 147<-259.94 0>\nLayerCount: 2\nBack\nSplineSet\n220 150 m 1xe8\n 681 150 l 1\n 683 1296 l 1\n 840 1296 l 1\n 839 158 l 1xf0\n 988 158 l 1\n 987 128 986 95 986 61 c 0\n 986 -52 994 -178 1026 -264 c 1\n 887 -299 l 1\n 854 -210 841 -110 839 0 c 1\n 682 0 l 1\n 59 -1 l 1\n 62 1296 l 1\n 220 1296 l 1\n 220 150 l 1xe8\nEndSplineSet\nFore\nSplineSet\n220 150 m 1xe8\n 697 150 l 1\n 699 1296 l 1\n 856 1296 l 1\n 855 158 l 1xf0\n 1004 158 l 1\n 1003 128 1002 95 1002 61 c 0\n 1002 -52 1010 -178 1042 -264 c 1\n 903 -299 l 1\n 870 -210 857 -110 855 0 c 1\n 698 0 l 1\n 59 -1 l 1\n 62 1296 l 1\n 220 1296 l 1\n 220 150 l 1xe8\nEndSplineSet\nValidated: 1\nEndChar" +"import logging\nimport math\nimport tkinter as tk\nfrom typing import TYPE_CHECKING, Optional, Tuple\n\nfrom core.api.grpc.wrappers import Interface, Link\nfrom core.gui import themes\nfrom core.gui.dialogs.linkconfig import LinkConfigurationDialog\nfrom core.gui.frames.link import EdgeInfoFrame, WirelessEdgeInfoFrame\nfrom core.gui.graph import tags\nfrom core.gui.nodeutils import NodeUtils\nfrom core.gui.utils import bandwidth_text, delay_jitter_text\n\nif TYPE_CHECKING:\n from core.gui.graph.graph import CanvasGraph\n\nTEXT_DISTANCE: float = 0.30\nEDGE_WIDTH: int = 3\nEDGE_COLOR: str = \"#ff0000\"\nWIRELESS_WIDTH: float = 3\nWIRELESS_COLOR: str = \"#009933\"\nARC_DISTANCE: int = 50\n\n\ndef create_edge_token(src: int, dst: int, network: int = None) -> Tuple[int, ...]:\n values = [src, dst]\n if network is not None:\n values.append(network)\n return tuple(sorted(values))\n\n\ndef arc_edges(edges) -> None:\n if not edges:\n return\n mid_index = len(edges) // 2\n if mid_index == 0:\n arc_step = ARC_DISTANCE\n else:\n arc_step = 
ARC_DISTANCE / mid_index\n # below edges\n arc = 0\n for edge in edges[:mid_index]:\n arc -= arc_step\n edge.arc = arc\n edge.redraw()\n # mid edge\n if len(edges) % 2 != 0:\n arc = 0\n edge = edges[mid_index]\n edge.arc = arc\n edge.redraw()\n mid_index += 1\n # above edges\n arc = 0\n for edge in edges[mid_index:]:\n arc += arc_step\n edge.arc = arc\n edge.redraw()\n\n\nclass Edge:\n tag: str = tags.EDGE\n\n def __init__(self, canvas: \"CanvasGraph\", src: int, dst: int = None) -> None:\n self.canvas" +"# Release Leads\n\nFor each release cycle, we dedicate a team of two individuals, one from Eventing\nand one from Serving, to shepherd the release process. Participation is\nvoluntary and based on good faith. We are only expected to participate during\nour local office hour.\n\n# Roster\n\nWe seed this rotation with all approvers from all the Serving and Eventing\nworkgroups, excluding productivity. If you are no longer active in Knative, or\nif you are contributing on personal capacity and do not have time to contribute\nin the rotation, feel free to send a PR to remove yourself.\n\n## Serving roster\n\nThis roster is seeded with all approvers from Serving workgroups.\n\n- dprotaso\n- julz\n- JRBANCEL\n- markusthoemmes\n- mattmoor\n- nak3\n- tcnghia\n- vagababov\n- yanweiguo\n- ZhiminXiang\n\n## Eventing roster\n\nThis roster is seeded with all approvers from Eventing workgroups.\n\n- evankanderson\n- grantr\n- Harwayne\n- lionelvillard\n- matzew\n- n3wscott\n- nachocano\n- slinkydeveloper\n- vaikas\n\n## Schedule\n\n| Release | Release Date | Serving | Eventing | Unpin repos | PKG cut |\n| ------- | ------------ | -------------- | --------------- | ----------- | ---------- |\n| v0.17 | 2020-08-18 | yanweiguo |" +"#!/bin/bash\n#\n# This script runs all tests for the root CDK project, as well as any microservices, Lambda functions, or dependency \n# source code packages. 
These include unit tests, integration tests, and snapshot tests.\n# \n# This script is called by the ../initialize-repo.sh file and the buildspec.yml file. It is important that this script \n# be tested and validated to ensure that all available test fixtures are run.\n#\n# The if/then blocks are for error handling. They will cause the script to stop executing if an error is thrown from the\n# node process running the test case(s). Removing them or not using them for additional calls will result in the \n# script continuing to execute despite an error being thrown.\n\n# Save the current working directory\nsource_dir=$PWD\n\n# Test the CDK project\nnpm install\nnpm run test\nif [ \"$?\" = \"1\" ]; then\n\techo \"(source/run-all-tests.sh) ERROR: there is likely output above.\" 1>&2\n\texit 1\nfi\n\n# Return to the source/ level\ncd $source_dir" +"import os\nimport json\n\n''' \tdotfile is a class for managing the local dotfile storage \n\t\t\t\n\t\t\tsaves a file called '.fu' to your home directory\n\t\t\tthe file is formatted as a json file\n\t\t\t\n\t\t\t{ \n\t\t\t\tresult\t: Last Search Result\n\t\t\t\tlast\t\t: Last Copied command\n\t\t\t\thistory\t: History of used commands\n\t\t\t}\n\n\t\t\tthe entire dot file is rewritten on every write\n\n'''\nclass dotfile:\n\t\t\n\t\t''' Initialize the dotfile '''\n\t\tdef __init__(self):\n\t\t\t\tself.path = os.path.join(os.getenv(\"HOME\"),'.fu')\n\t\t\t\tself.__load()\n\t\t\t\tself.history_size = 30\n\n\t\t''' Get the history of all the copied commands '''\n\t\tdef history(self):\n\t\t\t\treturn self._history\n\t\t\t\t\n\t\t''' Get the command that was copied '''\n\t\tdef last(self):\n\t\t\t\treturn self._last\n\n\t\t''' Get the last search result '''\n\t\tdef result(self):\n\t\t\t\treturn str(self._result)\n\n\t\t''' Save the search result to the dotfile '''\n\t\tdef save_result(self,string):\n\t\t\t\tself._result = string\n\n\t\t''' Copy will add the string to be copied to the dotfile '''\n\t\tdef save_copy(self,string):\n\t\t\t\tself._last = string\n\t\t\t\tself.__record(string)\n\t\t\n\t\t''' Record will add a command to the history '''\n\t\tdef __record(self,string):\n\n\t\t\t\t# If we are at capacity, drop the oldest entry\n\t\t\t\tif len(self._history) >= self.history_size : \n\t\t\t\t\t\tself._history = self._history[:self.history_size - 1]\n\t\t\t\t\n\t\t\t\t# Prepend to the history\n\t\t\t\tself._history.insert(0,string)\n\n\t\t''' Private method for loading the dotfile '''\n\t\tdef __load(self):\n\t\t\t\t\n\t\t\t\t# If the file doesn't exist make it\n\t\t\t\tif not os.path.isfile(self.path):\n\t\t\t\t\t\tself.__make()\n\t\t\t # Read the" +"{% comment %}\nLoops through all reviewer, translator, and translation editor data to create an acknowledgement sentence. This is used both on project-team.md as well as es/equipo-de-proyecto.md\n{% endcomment %}\n\n{% assign alllessons = site.pages | where: \"lesson\" , \"true\" %}\n\n{% assign reviewers = alllessons | map: \"reviewers\" %}\n{% assign editors = alllessons | map: \"editors\" %}\n{% assign translators = alllessons | map: \"translator\" %}\n{% assign translation_editors = alllessons | map: \"translation-editors\" %}\n{% assign translation_reviewers = alllessons | map: \"translation-reviewer\" %}\n\n{% assign allpersons = reviewers | concat: editors | concat: translators | concat: translation_editors | concat: translation_reviewers %}\n\n{% assign contributors = allpersons | compact | uniq %}\n\n{{ contributors | join: \", \" }}," +"#pragma once\n\nclass LineSegment;\nclass PuresoftRasterizer\n{\npublic:\n\ttypedef struct\n\t{\n\t\tfloat x, y;\n\t} VERTEX2F;\n\n\ttypedef struct\n\t{\n\t\tint left;\n\t\tint leftClamped;\n\t\tint leftVerts[2]; // indices of verts for left point\n\t\tint right;\n\t\tint rightClamped;\n\t\tint rightVerts[2]; // indices of verts for right point\n\t} RESULT_ROW;\n\n\ttypedef struct\n\t{\n\t\tVERTEX2F vertices[3];\n\t\tint firstRow;\n\t\tint lastRow;\n\t\tRESULT_ROW* m_rows;\n\t} 
RESULT;\n\npublic:\n\tPuresoftRasterizer(void);\n\t~PuresoftRasterizer(void);\n\n\tconst RESULT* initialize(int width, int height);\n\tbool pushTriangle(const float* vert0, const float* vert1, const float* vert2);\n\nprivate:\n\tint m_width;\n\tint m_height;\n\tint m_halfWidth;\n\tint m_halfHeight;\n\tint m_resultCapacity;\n\tRESULT m_output;\n\n\tinline void pushVertex(int idx, const float* vert);\n\tinline void processTriangle(const LineSegment& edgeL, const LineSegment& edgeR, float yMin, float yMax);\n\tinline void processStandingTriangle(const VERTEX2F* verts, int top, int bottom, int third);\n};" +"gandi-live-dns\n----\n\nThis is a simple dynamic DNS updater for the\n[Gandi](https://www.gandi.net) registrar. It uses their [LiveDNS REST API](http://doc.livedns.gandi.net/) to update the zone file for a subdomain of a domain to point at the external IPv4 address of the computer it has been run from.\n\nIt has been developed on Debian 8 Jessie and tested on Debian 9 Stretch GNU/Linux using Python 2.7.\n\nWith the new v5 Website, Gandi has also launched a new REST API which makes it easier to communicate via bash/curl or python/requests. \n\n### Goal\n\nYou want your homeserver to be always available at `dynamic_subdomain.mydomain.tld`.\n\n### Debian Package Requirements\n\n`apt-get update && apt-get upgrade && apt-get install unzip python-requests python-args python-simplejson`\n\n#### API Key\nFirst, you must apply for an API key with Gandi. Visit \nhttps://account.gandi.net/en/ and apply for (at least) the production API \nkey by following their directions.\n\n#### A DNS Record \nCreate the DNS A Records in the GANDI Webinterface which you want to update if your IP changes. \n\n#### Git Clone or Download the Script\nDownload the Script from here as [zip](https://github.com/cavebeat/gandi-live-dns/archive/master.zip)/[tar.gz](https://github.com/cavebeat/gandi-live-dns/archive/master.tar.gz) and extract it. 
\n\nor clone from git\n\n`git clone https://github.com/cavebeat/gandi-live-dns.git` \n\n#### Script Configuration\nThen you'd need to configure the script in the" +"# Compression\n\n## Client side\nThe preferred method for configuring message compression on a client is to pass `options` when the client object is instantiated.\n\nThese two options control compression behavior:\n\n**grpc.default_compression_algorithm** (int)\n\nDefault compression algorithm for the channel, applies to sending messages.\n\nPossible values for this option are:\n- `0` - No compression\n- `1` - Compress with DEFLATE algorithm\n- `2` - Compress with GZIP algorithm\n- `3` - Stream compression with GZIP algorithm\n\n**grpc.default_compression_level** (int)\n\nDefault compression level for the channel, applies to receiving messages.\n\nPossible values for this option are:\n- `0` - None\n- `1` - Low level\n- `2` - Medium level\n- `3` - High level\n\n### Code example\n```javascript\nclient = new ExampleClient(\"example.com\", credentials.createInsecure(), {'grpc.default_compression_algorithm': 2, 'grpc.default_compression_level': 2});\n```" +"configs:\n buildkitd-config:\n buildkitd.toml: |\n debug=true\n [gc]\n enabled=false\n [worker.oci]\n enabled=true\n gc=false\n gckeepstorage=1073741824\n [grpc]\n address = [ \"tcp://0.0.0.0:8080\" ]\n # debugAddress is address for attaching go profiles and debuggers.\n debugAddress = \"0.0.0.0:6060\"\n\nservices:\n buildkitd:\n version: v0\n serviceMesh: false\n image: \"moby/buildkit:v0.6.1\"\n ports:\n - 8080/http,buildkit,expose=false\n privileged: true\n configs:\n - buildkitd-config:/etc/buildkit\n containers:\n - name: registry\n image: \"registry:2\"\n env:\n - REGISTRY_HTTP_ADDR=0.0.0.0:80\n ports:\n - 80:80/tcp,registry,expose=false\n volume:\n - rio-registry:/var/lib/registry,persistent=${PERSISTENT}\n webhook:\n version: v0\n global_permissions:\n - \"* gitwatcher.cattle.io/gitwatchers\"\n - \"* gitwatcher.cattle.io/gitcommits\"\n - '* 
configmaps'\n - '* events'\n - get,list pods\n - create,get,list /pods/log\n - secrets\n image: rancher/gitwatcher:v0.4.5\n args:\n - gitwatcher\n - --listen-address\n - :8090\n imagePullPolicy: always\n ports:\n - 8090/http,http-webhook\n\ntemplate:\n envSubst: true\n questions:\n - variable: PERSISTENT\n description: \"Use PV to store registry data\"" +"---\ntitle: \"Knowing grids means breaking grids\"\nexcerpt: \"Exploring what it means to develop a grid system that helps facilitate strong design with purpose.\"\nlast_modified_at: 2013-04-26\nimage: \n path: &image /assets/images/knowing-grids-feature.jpg\n width: 1280\n height: 640\n feature: *image\ntwitter:\n card: summary_large_image\ncategories: [notes]\ntags: [design, inspiration]\nsupport: false\ntoc: true\n---\n\nOf all the things to grab my attention, the grid system used in a women's health calendar was certainly not high on my list.\n\nIt's a funny story that will probably make less sense after I describe this awful analogy, but here it goes. You ever do something for so long that you start seeing it creep up in random places? Back in 1997, after I got a Nintendo 64 and played *Super Mario* all night instead of working on my 3D design projects. I started getting these crazy ideas that I could triple jump over ponds or reach the top of buildings by doing a back flip like that mustachioed plumber.\n\nSitting in 2D design and typography classes for 4 years of my life had a similar effect. It was like having a magician expose all his secrets to me. Problem is, I can no longer take these tricks at" +"## 0x Contribution Guide\n\nWe welcome contributions from anyone on the internet and are grateful for even the smallest contributions. This document will help get you setup to start contributing back to 0x.\n\n### Getting started\n\n1. Fork `0xproject/0x-monorepo`\n2. Clone your fork\n3. 
Follow the [installation & build steps](https://github.com/0xProject/0x-monorepo#install-dependencies) in the repo's top-level README.\n4. Setup the recommended [Development Tooling](#development-tooling).\n5. Open a PR with the `[WIP]` flag against the `development` branch and describe the change you are intending to undertake in the PR description. (see [our branch naming conventions](#branch-structure))\n\nBefore removing the `[WIP]` tag and submitting the PR for review, make sure:\n\n- It passes our linter checks (`yarn lint`)\n- It is properly formatted with Prettier (`yarn prettier`)\n- It passes our continuous integration tests (See: [Enabling code coverage checks on your fork](#enabling-code-coverage-checks-on-your-fork) for instructions on getting the `submit-coverage` test to pass on forks)\n- You've created/updated the corresponding [CHANGELOG](#CHANGELOGs) entries.\n- Your changes have sufficient test coverage (e.g regression tests have been added for bug fixes)\n\n### Branch structure\n\nWe have two main branches:\n\n- `master` represents the most recently released (published on npm) version of the codebase.\n- `development` represents the current development state of" +"NB: All recent commits are available online.\nThis file will not be updated further.\nSee https://github.com/gggeek/phpxmlrpc/commits/master\n\n2014-05-26 - G. Giunta (giunta.gaetano@gmail.com)\n\n\t* removed bundled phpunit\n\t* converted all tabs to spaces in php files and removed closing tags\n\n2014-05-12 - Samu Voutilainen (smar@smar.fi)\n\n\t* removed obsolete xml.so open; dl() is deprecated and removed from 5.3.0\n\t* removed deprecated xmlEntities\n\t* removed deprecated xmlrpc_backslash\n\t* converted $GLOBALS to internal class. This makes testing much easier and should be more flexible regarding other projects\n\t* changed verifyhost from 1 to 2. This makes modern php versions work properly.\n\t* split off each class in its own file\n\n2014-02-03 - G. 
Giunta (giunta.gaetano@gmail.com)\n\n\t* bumped up requirements to php 5.1.0\n\n2014-01-10 - G. Giunta (giunta.gaetano@gmail.com)\n\n\t* xmlrpc.inc: when using curl and keepalive, reset curl handle if we did not get back an http 200 response (eg a 302)\n\n\t* testsuite.php, parse_args.php: update testsuite\n\n\t* debugger/controller.php: change default path to javascript debugger\n\n2010-05-23 - G. Giunta (giunta.gaetano@gmail.com)\n\n\t* xmlrpc.inc: omit port on http 'Host' header if it is 80;\n\tadd a namespace declaration in response if ex:nil is in use\n\n2010-04-12 - G. Giunta (giunta.gaetano@gmail.com)\n\n\t* testsuite.php, parse_args.php: testsuite allows interrogating https servers ignoring" +"# Defining a Lagom build\n\nAs already discussed in [[Lagom build philosophy|BuildConcepts]], with Lagom you are free to combine all your services in a single build, or build them individually.\n\nBelow, we describe how to make a single build containing all your services. The `hello` sample follows this structure.\n\nThen, in the next section, we'll describe the alternative approach of one build per service.\n\n## Understanding your project structure\n\nEvery service contains at least two parts: an API project and an implementation project. (These are subprojects within the same build.)\n\nThe API project contains the service interface, also known as the descriptor, along with all the data models that the interface uses, e.g. request and response messages. The API project can be depended on and consumed by other services.\n\nThe implementation project will naturally also depend on the API project, in order to implement it.\n\nConsider the sample system below:\n\n![Lagom project structure](resources/guide/build/lagom-project-structure.png)\n\nThis system has two services, one called `hello`, and one called `hello-stream`. 
Each service has two sbt projects defined, an API project, `hello-api` and `hello-stream-api`, and an implementation project, `hello-impl` and `hello-stream-impl`. Additionally, `hello-stream-impl` depends on `hello-api`, and uses that to invoke calls on `hello`.\n\n* [Defining" +"===== Stopping a Node\nInstalled nodes are started automatically when the SymmetricDS server is started. An individual node instance can be stopped while other nodes continue to run.\n\nifdef::pro[]\nTo stop a node, select the node you want to stop and click on the *Control* button and choose *Stop*. The node's status will indicate that it has been stopped.\nendif::pro[]\n\nFrom the command line, you can use JMX to stop a node. The following is an example. You would replace with the name of the engine as found in the <>\n\n`bin/jmx --bean org.jumpmind.symmetric.:name=Node --method stop`\n\n===== Uninstalling a Node \n\nUninstalling a node will remove all SymmetricDS database artifacts and delete the engine's property file.\n\nWARNING: This cannot be undone so be sure to uninstall with caution.\n\nifdef::pro[]\nTo uninstall a node, select the node you want to uninstall and click on the _Control_ button and choose _Uninstall_.\n\n.Control->Uninstall\nimage::uninstall.png[]\n\nIf the node has no children, you will be prompted by a confirm dialog to make sure you want to uninstall the node.\n\n.Uninstall single node\nimage::uninstall-single.png[]\n\nIf the node has child nodes you will be told that uninstalling the parent node will uninstall" +"\n# Mathematical Optimization\n\nMathematical optimization involves the maximization or minimization of a function through clever choices of the input variables and the analysis of the output. They are mostly iterative as the underlying function is not easily analysed.\n\nBelow is a comparison of the optimization algorithms that we use and their general use cases.\n\n## Gradient Ascent\n\nGradient Ascent involves using the first order derivative to choose the next point. It will then move in the direction of the fastest increasing or decreasing gradient. As it passes the maximum, the step size decreases. Once the gradient is close enough to zero or the step size is small enough, the optimization algorithm exits.\n\nPros:\n * Very good at finding the optimum point of a smooth unimodal function.\n * Parallelizable\n * Various configurable attributes to speed up convergence\n\nCons:\n * The gradient must be known at all points.\n * Has trouble with noisy functions\n * Can become stuck in local optima.\n * Somewhat slow in convergence\n\n## Parallel Gradient Ascent\n\nParallel Gradient Ascent combines multiple independent Gradient Ascents together and operates on them as a set. If two inputs are near to each other, the two Gradient Ascents are combined" +"---\nlayout: index\ntitle: Move an object in a direction\n---\n\n\nOccasionally you might want to have the player move an object from one room to another by pushing or pulling it, rather than carrying it. Let us suppose there is a heavy crate; the player cannot lift it, but she could push it into the room to the west, then climb up it to get to a trapdoor in the ceiling.\n\nBasic Command\n-------------\n\nThis is pretty easy to do in its simplest form. 
We need a new command, with this pattern:\n\n> push #object# #exit#\n\nThe script will only run if both the object and exit have a match, so all the script has to do is move the object to the destination of the exit (which is set in its \"to\" attribute), and tell the player:\n\n```\nmsg (\"You push \" + object.article + \" \" + exit.alias + \".\")\nobject.parent = exit.to\n```\n\nYou might also want the player to end up in the other room - I am not sure what the player would expect. If so, then just add an extra line.\n\n```\nmsg (\"You push \" + object.article + \" \" + exit.alias" +"---\ndescription: A Metadata Provider is a JavaScript function that acts as an interface for accessing metadata related to Images in Cornerstone.\n---\n\n# Metadata Providers\n\n> A **Metadata Provider** is a JavaScript function that acts as an interface for accessing metadata related to Images in Cornerstone. Users can define their own provider functions in order to return any metadata they wish for each specific image.\n\nMedical images typically come with lots of non-pixel-wise metadata such as for example, the pixel spacing of the image, the patient ID, or the scan acquisition date. With some file types (e.g. DICOM), this information is stored within the file header and can be read and parsed and passed around your application. With others (e.g. JPEG, PNG), this information needs to be provided independently from the actual pixel data. Even for DICOM images, however, it is common for application developers to provide metadata independently from the transmission of pixel data from the server to the client since this can considerably improve performance.\n\nTo handle these scenarios, Cornerstone provides infrastructure for the definition and usage of *Metadata Providers*. 
Metadata Providers are simply functions which take in an [Image Id](image-ids.md) and a specified metadata type, and return" +"# Set to linux or name of OS\nARG GO_OS\n\n# Set to arch name of your system\nARG GO_ARCH\n\n# Terraform Version\nARG TERRAFORM_VERSION=0.12.0\n\n# Grab the Terraform binary\nFROM hashicorp/terraform:$TERRAFORM_VERSION AS terraform\n\n# Grab the Terraform Libvirt Provider binary\nFROM provider-libvirt:v0.5.2-debian AS libvirt\n\n# Base Image\nFROM ubuntu:18.04\n\nARG GO_OS\nENV GO_OS=$GO_OS\nARG GO_ARCH\nENV GO_ARCH=$GO_ARCH\n\n# Set working directory\nWORKDIR /root/\n\n# Make Directory for Provider Binaries\nRUN mkdir -p /root/.terraform.d/plugins/${GO_OS}_${GO_ARCH}/\n\n# Copy binaries from containers\nCOPY --from=terraform /bin/terraform /bin/\nCOPY --from=libvirt /go/bin/terraform-provider-libvirt /root/.terraform.d/plugins/${GO_OS}_${GO_ARCH}/\n\n# Install Dependencies\n# libvirt0 is needed to run the provider. xsltproc needed to use XML/XSLT. mkisofs needed to use cloud init images\n# ca-certificates to avoid terraform init 509 error. openssh-client to talk to remote libvirt server\nRUN apt-get update \\\n && apt-get upgrade -y \\\n && apt-get install \\\n -y --no-install-recommends \\\n libvirt0 xsltproc mkisofs ca-certificates openssh-client \\\n && rm -rf /var/lib/apt/lists/*\n\n# Copy Terraform Files\n# COPY libvirt.tf /root/\n\n# Terraform commands\n# RUN terraform init\n\nENTRYPOINT /bin/bash" +"============\nDomain model\n============\n\nThe different entities in a model reference each other, and input data must\nthus be entered in the correct order. This list shows the correct order and\ndependencies. 
Entities with a higher indentation depend on entities with\nless indentation.\n\nStart populating the entities at the top of the list and work your way down.\n\n| :doc:`customers` (references itself)\n| :doc:`setup-matrices`\n| :doc:`skills`\n| :doc:`calendars`\n| :doc:`Calendar bucket ` (references calendars)\n| :doc:`locations` (references calendars and itself)\n| :doc:`suppliers` (references calendars and itself)\n| :doc:`resources` (references setup matrices, calendars, locations and itself)\n| :doc:`items` (references itself)\n| :doc:`operations` (references items, locations and customers)\n| :doc:`sales-orders` (references items, customers, operations, locations and itself)\n| :doc:`buffers` (references items, locations, calendars and itself)\n| :doc:`operation-resources` (references resources, skills and operations)\n| :doc:`operation-materials` (references items and operations)\n| :doc:`item-suppliers` (references suppliers, items, resources and locations)\n| :doc:`item-distributions` (references locations, items and resources)\n| :doc:`resource-skills` (references skills and resources)\n| :doc:`Sub operation ` (references operations)\n| :doc:`manufacturing-orders` (references operations)\n| :doc:`purchase-orders` (references items, suppliers and locations)\n| :doc:`distribution-orders` (references items and locations)\n| :doc:`buckets`\n| :doc:`parameters`\n\n.. image:: _images/dependencies.png\n :alt: Model dependencies\n\nNote that it is pretty straightforward to extend the data" +".. algorithm::\n\n.. summary::\n\n.. relatedalgorithms::\n\n.. 
properties::\n\nDescription\n-----------\n\nGiven an initial UB at some goniometer configuration, this algorithm facilitates\nthe 'linking' of the UBs across orientations - in other words, ensuring the\ncontinuity of the indexing of reflections throughout reciprocal space allowing\nfor grouping of reflections and refinement.\n\nOn chopper instruments, when the sample is lowered into the\nblockhouse there is often no possibility to adjust its position. When rotating the\ncrystal via the goniometer, since the crystal is likely not centred exactly, the\npredicted peaks from the initial UB often do not capture the data. As well as \nconsistently indexing the peaks, the algorithm also effectively carries out a U\nmatrix correction that accounts for sample miscentering. Use of this algorithm \nwill result in a separate UB matrix for each orientation which can then be used \nfor integration. \n\nThe algorithm requires a set of predicted peaks that have been generated from\nthe initial UB via goniometer rotation, a set of observed (found) peaks, and\nthe lattice parameters in order to calculate the B matrix. 
A search within a\nQ-envelope is carried out in which all peaks within the envelope are screened\nas potential 'matches' to the observed" +"StartChar: exclamdown\nEncoding: 161 161 40\nWidth: 275\nVWidth: 0\nFlags: HMW\nLayerCount: 3\nFore\nSplineSet\n125 374 m 24\n 92 374 65 401 65 434 c 24\n 65 467 92 494 125 494 c 24\n 158 494 185 467 185 434 c 24\n 185 401 158 374 125 374 c 24\n126 -180 m 0\n 90 -180 74 -145 74 -106 c 0\n 74 -43 87 87 97 205 c 0\n 99 234 110 263 125 263 c 0\n 135 263 145 233 147 211 c 0\n 159 95 174 -21 174 -108 c 0\n 174 -144 159 -180 126 -180 c 0\nEndSplineSet\nEndChar" +"title: Checklist de \u00c9tica na Ci\u00eancia de Dados\nsections: \n - title: Coleta de dados\n section_id: A\n lines:\n - line_id: A.1\n line_summary: Consentimento informado\n line: se houver sujeitos humanos, eles deram consentimento informado, onde os sujeitos afirmativamente optaram por se inscrever e t\u00eam uma compreens\u00e3o clara dos usos dos dados para que consentiram?\n - line_id: A.2\n line_summary: Vi\u00e9s de coleta\n line: Consideramos as fontes de vi\u00e9s que poderiam ser introduzidas durante a coleta de dados e o desenho da pesquisa e tomamos medidas para mitig\u00e1-los?\n - line_id: A.3\n line_summary: Limita\u00e7\u00e3o de exposi\u00e7\u00e3o de IIP\n line: Consideramos maneiras de minimizar a exposi\u00e7\u00e3o de informa\u00e7\u00e3o de identifica\u00e7\u00e3o pessoal (IIP), por exemplo atrav\u00e9s de anonimiza\u00e7\u00e3o ou por n\u00e3o coletar informa\u00e7\u00e3o irrelevante para a an\u00e1lise?\n - title: Armazenamento de dados\n section_id: B\n lines:\n - line_id: B.1\n line_summary: Seguran\u00e7a de dados\n line: Temos um plano para proteger e dar seguran\u00e7a aos dados (por exemplo, criptografia no armazenamento e em tr\u00e2nsito, controles de acesso dos usu\u00e1rios internos e terceiros, registros de acesso e softwares atualizados)?\n - line_id: B.2\n line_summary: Direito ao esquecimento\n line: Temos um mecanismo pelo qual um 
indiv\u00edduo pode requerer a remo\u00e7\u00e3o de sua informa\u00e7\u00e3o pessoal?\n - line_id: B.3\n line_summary: Plano de reten\u00e7\u00e3o de dados" +"// TODO: Remove these globals by rewriting gamedescription.js\nconst g_MapSizes = prepareForDropdown(g_Settings && g_Settings.MapSizes);\nconst g_MapTypes = prepareForDropdown(g_Settings && g_Settings.MapTypes);\nconst g_PopulationCapacities = prepareForDropdown(g_Settings && g_Settings.PopulationCapacities);\nconst g_WorldPopulationCapacities = prepareForDropdown(g_Settings && g_Settings.WorldPopulationCapacities);\nconst g_StartingResources = prepareForDropdown(g_Settings && g_Settings.StartingResources);\nconst g_VictoryConditions = g_Settings && g_Settings.VictoryConditions;\n\n/**\n * Offer users to select playable civs only.\n * Load unselectable civs as they could appear in scenario maps.\n */\nconst g_CivData = loadCivData(false, false);\n\n/**\n * Whether this is a single- or multiplayer match.\n */\nconst g_IsNetworked = Engine.HasNetClient();\n\n/**\n * Is this user in control of game settings (i.e. 
is a network server, or offline player).\n */\nconst g_IsController = !g_IsNetworked || Engine.HasNetServer();\n\n/**\n * Central data storing all settings relevant to the map generation and simulation.\n */\nvar g_GameAttributes = {};\n\n/**\n * Remembers which clients are assigned to which player slots and whether they are ready.\n * The keys are GUIDs or \"local\" in single-player.\n */\nvar g_PlayerAssignments = {};\n\n/**\n * This instance owns all handlers that control the two synchronized states g_GameAttributes and g_PlayerAssignments.\n */\nvar g_SetupWindow;\n\n// TODO: Remove these two global functions by specifying the JS class name in the XML of the GUI page.\n\nfunction init(initData," +"\n// copyright marazmista @ 12.05.2014\n\n// class for communication with daemon\n\n#ifndef DAEMONCOMM_H\n#define DAEMONCOMM_H\n\n#include \n#include \n#include \n\n#define SEPARATOR '#'\n#define DAEMON_SIGNAL_CONFIG '0'\n#define DAEMON_SIGNAL_READ_CLOCKS '1'\n#define DAEMON_SIGNAL_SET_VALUE '2'\n#define DAEMON_SIGNAL_TIMER_ON '4'\n#define DAEMON_SIGNAL_TIMER_OFF '5'\n#define DAEMON_SHAREDMEM_KEY '6'\n#define DAEMON_ALIVE '7'\n\nclass DaemonComm : public QObject\n{\n Q_OBJECT\n\npublic:\n\n enum ConfirmationMehtod {\n DISABLED,\n ON_REQUEST,\n PERIODICALLY\n };\n\n DaemonComm();\n ~DaemonComm();\n void connectToDaemon();\n void disconnectDaemon();\n void sendCommand(const QString command);\n void setConnectionConfirmationMethod(const ConfirmationMehtod method);\n\n inline bool isConnected() {\n return (signalSender->state() == QLocalSocket::ConnectedState);\n }\n\n inline const QLocalSocket* getSocketPtr() {\n return signalSender;\n }\n\n\npublic slots:\n void receiveFromDaemon();\n void sendConnectionConfirmation();\n\nprivate:\n QDataStream feedback;\n QLocalSocket *signalSender;\n QTimer *confirmationTimer;\n};\n\n#endif // DAEMONCOMM_H" +"---\n# CRD in version v1beta1. 
Use this for Kubernetes clusters < 1.16\n\n# CRD connecting a ConfigMap with a set of pods which needs to\n# be restarted when the ConfigMap changes\napiVersion: apiextensions.k8s.io/v1beta1\nkind: CustomResourceDefinition\nmetadata:\n name: configwatchers.k8spatterns.io\nspec:\n scope: Namespaced\n group: k8spatterns.io\n # Additional columns to print when in kubectl get\n additionalPrinterColumns:\n - name: configmap\n description: Name of ConfigMap to watch\n type: string\n JSONPath: .spec.configMap\n - name: podselector\n description: Selector for Pods to restart\n type: string\n JSONPath: .spec.podSelector\n versions:\n - name: v1\n # Enabled\n served: true\n # The version stored in the backend\n storage: true\n names:\n # Kind of this CRD\n kind: ConfigWatcher\n # How to access them via client and REST api\n singular: configwatcher\n plural: configwatchers\n # How to access the CRDs as well (e.g. with \"kubectl get cw\")\n shortNames: [ cw ]\n # Adds Configwatcher to the \"all\" category (e.g. \"kubectl get all\")\n categories: [ all ]\n validation:\n # Validation schema\n openAPIV3Schema:\n properties:\n spec:\n properties:\n configMap:\n type: string\n description: Name of the ConfigMap to monitor for changes\n podSelector:\n type: object\n description: Label selector used for" +"# plex2netflix\n\nThis simple tool checks how much of your media from Plex is available to watch on Netflix, and gives you a nice summary with the percentage of media that is available.\n\nI made this tool because I someday want to make the jump to Netflix, but I want to know beforehand how much of the media I have is available there.\n\nIt works by using the Plex Web API to get a list of all media from a given library section. 
If an item has an IMDb ID (you need to enable the IMDb agent for this in Plex), it uses this to search in the [uNoGS](http://unogs.com/) database which has Netflix availability data. If there is no IMDb ID, the item title and year are used.\n\n[**Powered by uNoGS**](http://unogs.com/).\n\n\"Example\"\n\n## Install\n\nYou need to have [Node.js](https://nodejs.org) (4.0 or higher). Install the tool with the node package manager:\n\n```\nnpm install -g plex2netflix\n```\n\nTo update, just run the command above again.\n\n## Usage\n\nFirst, you need to get your [API token from Plex](https://support.plex.tv/hc/en-us/articles/204059436-Finding-your-account-token-X-Plex-Token).\n\n```\nplex2netflix --host 192.168.0.42 --token=xxx --country=us --section=Movies\n```\n\nBy default it searches the Netflix library of the US. You can specify `--country`" +"--- rec_parse.y.orig\t2001-12-10 17:01:17 UTC\n+++ rec_parse.y\n@@ -32,9 +32,8 @@\n #include \"dmalloc.h\"\n #endif\n \n-static int yyerror(char *err);\n+static int yyerror(rec_t *rec, char *err);\n \n-#define YYPARSE_PARAM rec\n #define YYERROR_VERBOSE\n \n #ifdef REC_PARSE_DEBUG\n@@ -47,6 +46,7 @@ static feature_list_t FEATURE_ERROR = { \n %}\n \n %pure_parser\n+%parse-param { rec_t *rec }\n \n %union {\n int ival;\n@@ -141,7 +141,7 @@ mode_decl\t: MODE STRING\n \t\t| MODE STRING \n \t\t\t{\n \t\t\t /* Do this first so the default mode gets set correctly*/\n-\t\t\t $$ = rec_get_mode((rec_t *) rec, $2);\n+\t\t\t $$ = rec_get_mode((rec_t *) rec, $2);\n \t\t\t}\n \t\t ':' mode_id_list\n \t\t\t{\n@@ -162,12 +162,14 @@ mode_id_list\t: mode_id\n \t\t\t $$ = $1;\n \t\t\t rec_mode_list_append(&$$, $3);\n \t\t\t}\n+\t\t\t;\n \n mode_id\t\t: STRING\n \t\t\t{\n \t\t\t $$ = rec_get_mode((rec_t *) rec, $1);\n \t\t\t free($1);\n \t\t\t}\n+\t\t\t;\n \n gesture_list\t: gesture\n \t\t\t{\n@@ -342,7 +344,7 @@ option\t\t: OPTION STRING STRING\n \t\t\t\n %%\n \n-static int yyerror(char *err)\n+static int yyerror(rec_t *rec, char *err)\n {\n 
char *loc = rec_lex_location_alloc();\n fprintf(stderr, \"%s: %s\\n\", loc, err);" +"Exercise nbextension history\n----------------------------\n\nUpdate december 30, 2015:\n(@jfbercher) Updated to jupyter notebook 4.1.x\n\nUpdate december 22, 2015:\n(@jfbercher)\n Added the metadata solution_first to mark the beginning of an exercise. It is now possible to have several consecutive exercises. \n\nOctober 21-27,2015: \n(@jfbercher)\n\n1- the extension now works with the multicell API, that is\n - several cells can be selected either via the rubberband extension \n - or via Shift-J (select next) or Shift-K (select previous) keyboard shortcuts\n(probably Shift-up and down will work in a near future) \nNote: previously, the extension required the selected cells to be marked with a \"selected\" key in metadata. This is no longer necessary with the new API.\nThen clicking on the toolbar button turns these cells into a \"solution\" which is hidden by default ** Do not forget to keep the Shift key pressed down while clicking on the menu button (otherwise selected cells will be lost)** \n2- the \"state\" of solutions, hidden or shown, is saved and restored at reload/restart. We use the \"solution\" metadata to store the current state.\n3- A small issue (infinite loop when a solution was defined at the bottom edge of the notebook) has been corrected\n4- Added a" +"# sympy/galgebra/manifold.py\n\n\"\"\"\nmanifold.py defines the Manifold class which allows one to create a\nvector manifold (manifold defined by a vector field of coordinates in an\nembedding vector space) and calculate the tangent vectors and derivatives\nof tangent vectors.\n\nOnce a manifold is created, multivector fields can be constructed in the\ntangent space and all the geometric algebra products and derivatives\nof the multivector fields calculated.\n\nNote that all calculations are done in the embedding space. 
Future\nversions of the code will allow manifolds defined purely in terms of\na metric.\n\"\"\"\n\nfrom __future__ import print_function\n\nfrom itertools import combinations\nfrom os import system\nimport copy\n\nfrom sympy import trigsimp, simplify\n\nfrom sympy.galgebra.ga import MV\nfrom sympy.galgebra.debug import oprint\nfrom sympy.galgebra.ncutil import linear_expand\nfrom sympy.galgebra.printing import find_executable\n\n\ndef fct_to_str(fct_names):\n import sys\n current_file = open(sys.argv[0], 'r')\n file_str = current_file.read()\n current_file.close()\n\n if isinstance(fct_names, str):\n return fct_names\n\n fcts_str = ''\n\n for fct_name in fct_names:\n start_def = file_str.find('\\ndef ' + fct_name)\n end_def = file_str.find('\\ndef ', start_def + 5)\n start_class = file_str.find('\\nclass ', start_def + 5)\n end_def = min(end_def, start_class)\n fcts_str += file_str[start_def:end_def]\n return fcts_str\n\n\ndef VectorComponents(X, basis):\n (coefs, bases) = linear_expand(X.obj)\n cdict = {}\n for (coef, base) in zip(coefs, bases):\n cdict[str(base)] = coef\n comp = []\n for base" +"name: test\nchannels:\n - conda-forge\ndependencies:\n - python=3.7\n - six\n - pyyaml\n - deprecated\n # testing\n - matplotlib\n - pytest\n - pytest-cov\n - coverage\n - codecov\n # optional\n - geopandas\n - bokeh\n - scikit-learn\n - numba\n - statsmodels\n - scikit-learn\n - geopandas\n - quantecon\n - rtree\n - tqdm\n - quilt3\n - rasterio\n - rasterstats\n - pytest\n - coverage\n - coveralls\n - ipywidgets\n - twine\n - urbanaccess\n - urllib3<1.25\n - wheel\n - dill\n - pytest\n - pytest-cov\n - codecov\n - bokeh\n - libpysal\n - esda\n - spaghetti\n - tobler>=0.2.1\n - spint\n - spreg\n - spvcm\n - mgwr\n - segregation\n - pointpats\n - giddy\n - mapclassify\n - seaborn\n - descartes\n - numba\n - inequality\n - access\n - splot" +"# Using Distillery with systemd\n\n!!! 
warning\n    You need to be aware that when running `bin/myapp upgrade `,\n    this command will be executed in the *caller's* environment, not\n    the environment defined by the systemd unit. If you need environment\n    variables to be available during the upgrade, then you need to either\n    execute it with the same environment as the systemd unit, or export those\n    environment variables in the calling environment.\n\nHere are two general approaches to running a Distillery release with systemd:\n\n## Run app as daemon using `start` and a `forking` Systemd service *with* pidfile\n\nProperties of this approach:\n\n * Your app will be automatically restarted if it crashes\n * Logs will be written to the `var/log` directory in your release\n * If your app is killed suddenly (on Linux this would mean receiving a `SIGKILL`, as used by the \"OOM Killer\") then your app may not get a chance to remove the pidfile and so it may not be restarted. If this is a concern, kill your app with `pkill -9 beam.smp` and ensure that the pidfile is removed and that systemd detects this and starts it again.\n\n```systemd\n[Unit]\nDescription=My App\nAfter=network.target\n\n[Service]\nType=forking\nUser=appuser\nGroup=appuser\nWorkingDirectory=/home/appuser/myapp\nExecStart=/home/appuser/myapp/bin/myapp start\nExecStop=/home/appuser/myapp/bin/myapp
We can't display actual FPS at every frame, because it will be changing quickly and so unreadable.\n public static int GetFps()\n {\n return (int)m_lastFpsDrawn;\n }\n\n public static int GetSessionTotalFrames()\n {\n return (int)m_sessionTotalFrames;\n }\n\n public static int GetMaxSessionFPS()\n {\n return (int)m_maxSessionFPS;\n }\n\n public static int GetMinSessionFPS()\n {\n return (int)m_minSessionFPS;\n }\n\n /// \n /// Returns update + render time of last frame in ms\n /// \n /// \n public static float FrameTime { get; private set; }\n\n /// \n /// Returns update + render time of last frame (Average in last second)\n /// \n /// \n public static float FrameTimeAvg { get; private set; }\n public static float FrameTimeMin { get; private set; }\n public static float FrameTimeMax { get; private set; }\n\n public static void Update()\n {\n m_fpsCounter++;\n m_sessionTotalFrames++;\n\n if (MySession.Static == null)" +" 'e13a9d',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '9b45e4',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => '222831',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '393e46',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => 'da2d2d',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '9d0b0b',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => '6f9a8d',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '1f6650',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => 'd1274b',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '3d0e1e',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => '71a95a',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '007944',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => 'e3b04b',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '2b2b28',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => 'f6ad7b',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => 'be7575',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => 'a34a28',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '211717',\n ],\n [\n self::KEY_NAME_BACKGROUND_COLOR => 'fc7fb2',\n self::KEY_DESCRIPTION_BACKGROUND_COLOR => '45454d',\n ],\n 
];\n\n}" +"// SPDX-License-Identifier: Apache-2.0\n\npackage firrtl.passes\npackage clocklist\n\nimport firrtl._\nimport firrtl.ir._\nimport wiring.Lineage\nimport Utils._\nimport memlib.AnalysisUtils._\n\nobject ClockListUtils {\n\n /** Returns a list of clock outputs from instances of external modules\n */\n def getSourceList(moduleMap: Map[String, DefModule])(lin: Lineage): Seq[String] = {\n val s = lin.foldLeft(Seq[String]()) {\n case (sL, (i, l)) =>\n val sLx = getSourceList(moduleMap)(l)\n val sLxx = sLx.map(i + \"$\" + _)\n sL ++ sLxx\n }\n val sourceList = moduleMap(lin.name) match {\n case ExtModule(i, n, ports, dn, p) =>\n val portExps = ports.flatMap { p => create_exps(WRef(p.name, p.tpe, PortKind, to_flow(p.direction))) }\n portExps.filter(e => (e.tpe == ClockType) && (flow(e) == SinkFlow)).map(_.serialize)\n case _ => Nil\n }\n val sx = sourceList ++ s\n sx\n }\n\n /** Returns a map from instance name to its clock origin.\n * Child instances are not included if they share the same clock as their parent\n */\n def getOrigins(\n connects: Connects,\n me: String,\n moduleMap: Map[String, DefModule]\n )(lin: Lineage\n ): Map[String, String] = {\n val sep = if (me == \"\") \"\" else \"$\"\n // Get origins from all children\n val childrenOrigins = lin.foldLeft(Map[String, String]()) {\n case (o, (i, l)) =>\n o ++ getOrigins(connects, me + sep + i, moduleMap)(l)\n }\n // If I have a clock," +"Britain's Britannic Assurance declared on Tuesday a 209 million pound ($336 million) special bonus for life insurance policies, following discussions with the government on ownership of surplus insurance funds.\r\nThe payout will apply to all \"with profits\" policies in force on February 17, 1997. 
Details will be given with the 1996 bonus due to be announced next month.\r\nThe company said last year it was discussing with the Department of Trade and Industry (DTI) ownership of long-term assets, and a way to distribute the surplus to policyholders and shareholders.\r\nBritannic has now agreed with the DTI that 902 million pounds of the excess assets within the long term fund can be attributable to shareholders.\r\nBritannic also said it intended to increase its dividend for the year by 82 percent to 23 pence per share following an increase in life profits. This would be the basis for continuing the company's progressive dividend policy.\r\nThe news boosted Britannic shares, which jumped 71 pence, or nearly nine percent, at one stage before settling for a rise of 29 pence at 832.5.\r\nThe money attributable to shareholders forms part of total assets in the life funds which amounted to 5.682 billion pounds at the end" +"(ns enliven.core.actions\n  (:refer-clojure :exclude [replace]))\n\n;; An action is made of an op, a scope-idx, an arg and\n\n(defn replace [path] {:op ::replace :scope-idx 0 :arg path})\n(defn dup [path sub] {:op ::dup :scope-idx 0 :arg path :subs [[list sub]]})\n\n(defmulti update (fn [action key f & args] key))\n\n(defmethod update :default [action key f & args]\n  (assoc action key (mapv #(apply f % args) (get action key))))\n\n#_(defmethod update :subs [action key f & args]\n  (assoc action :subs (mapv (fn [[p sub]] [p (apply f sub args)]) (:subs action))))\n\n(defmethod update :arg [action key f & args]\n  (if-let [path (:arg action)]\n    (assoc action :arg (apply f path args))\n    action))" +"# Writing actions\n\n## What is an action?\n\nAn action is basically what a controller in an MVC architecture is.
It is the glue that interacts with various services and other actions to achieve some business-specific task.\n\nUsing actions is optional (unless you are writing a component which you want other people to use), but they are a great place to put re-usable pieces of logic. You can then call your actions from an http server, or from a CLI interface or from an interactive REPL session. If you put all this logic into an http route handler, you would not be able to easily execute it from other places.\n\nSome common traits of an action:\n\n- They usually interact with many other components (like services or other actions)\n- They frequently contain business-specific logic or accomplish a very specific task\n- They do not need to keep any kind of state (they are stateless - it's just input params->returning results)\n\n> **NOTE**: Contrary to services and hooks, the Action class you implement **is** what you will interface with - when you register an Action into the Atlas instance, the class is instantiated and exposed to you via `atlas.actions.*`.\n\n##" +"# Colourama\n\nColourama used typosquatting to register a package that had a similar name to\nColorama, one of the top 20 most downloaded legitimate modules\nin the PyPI registry with 1 million downloads on a daily basis. The colourama\npackage contains malware which targets Windows machines to implement a\ncryptocurrency clipboard hijacker. As a result, it was able to divert any\nBitcoin payment from victim machines to the attacker's bitcoin address.\n\n## Impact\n\nColourama was registered early in December 2017. It is not clear how many times\nthe malicious package has been downloaded since then. According to a report by\nMedium, it was downloaded 55 times in October 2018.\n\n## Type of compromise\n\nA typosquat attack does not require compromising any type of infrastructure."
+"@validation\nFeature: parameter validation\n Scenario: identifier is required with failover\n When I execute \"cli53 rrcreate --failover PRIMARY $domain 'a A 127.0.0.1'\"\n Then the exit code was 1\n\n Scenario: identifier is required with weight\n When I execute \"cli53 rrcreate --weight 10 $domain 'a A 127.0.0.1'\"\n Then the exit code was 1\n\n Scenario: identifier is required with region\n When I execute \"cli53 rrcreate --region us-west-1 $domain 'a A 127.0.0.1'\"\n Then the exit code was 1\n\n Scenario: identifier alone is invalid\n When I execute \"cli53 rrcreate -i id $domain 'a A 127.0.0.1'\"\n Then the exit code was 1\n\n Scenario: failover must be PRIMARY/SECONDARY\n When I execute \"cli53 rrcreate -i id --failover JUNK $domain 'a A 127.0.0.1'\"\n Then the exit code was 1\n\n Scenario: failover and weight are mutually exclusive\n When I execute \"cli53 rrcreate -i id --failover PRIMARY --weight 10 $domain 'a A 127.0.0.1'\"\n Then the exit code was 1\n\n Scenario: passing --append and --replace at the same time makes no sense\n When I execute \"cli53 rrcreate --append --replace $domain 'a A 127.0.0.2'\"\n Then the exit code was 1\n\n Scenario: create requires one argument\n When I execute \"cli53 create a b\"\n Then the exit code was 1\n\n Scenario: delete requires one argument\n When" +"\ufeffusing DotNet;\r\n\r\nusing Nemerle.Collections;\r\n\r\nusing Nitra.AstUtils;\r\nusing Nitra.Declarations;\r\nusing Nitra.Utils;\r\n\r\nusing System.Drawing;\r\n\r\nusing R = Nitra.Ast.RegexExpressions;\r\n\r\nnamespace Nitra.Ast.RegexExpressions\r\n{\r\n abstract ast Expression : BindableAst\r\n {\r\n }\r\n\r\n abstract ast Unary : R.Expression\r\n {\r\n Expression.Scope = Scope;\r\n Expression : R.Expression;\r\n }\r\n\r\n abstract ast Binary : R.Expression\r\n {\r\n Expression1.Scope = Scope;\r\n Expression2.Scope = Scope;\r\n\r\n Expression1 : R.Expression;\r\n Expression2 : R.Expression;\r\n }\r\n\r\n abstract ast List : R.Unary\r\n {\r\n Expressions.Scope = 
Scope;\r\n\r\n Expressions : R.Expression*;\r\n }\r\n\r\n ast Sequence : R.List { }\r\n ast Choice : R.List { }\r\n ast Subtract : R.Binary { }\r\n ast Optional : R.Unary { }\r\n ast Repeat : R.Unary { }\r\n ast RepeatWithSeparator : R.Unary\r\n {\r\n Separator.Scope = Scope;\r\n Separator : R.Expression;\r\n }\r\n\r\n ast Call : R.Expression\r\n {\r\n RuleReference.Scope = Scope;\r\n\r\n RuleReference : QualifiedReference;\r\n }\r\n\r\n ast Char : R.Expression { Literal : CharLiteral; }\r\n ast String : R.Expression { Literal : StringLiteral; }\r\n ast Range : R.Expression { }\r\n ast InvertedRange : R.Expression { }\r\n}" +"\nMISSPELLED WORDS\n\nJonze\nINT\nTHEODORE'S\nparents'\npg\nO\nS\nBeautifulhandwrittenletters\npg\nINT\nTHEODORE'S\nBadass\nBuh\nINT\nTHEODORE'S\npg\nemails\nLewman's\nmopey\npg\nemails\nINT\nhandheld\nbetw\nINT\nINT\nTHEODORE'S\nINT\nTHEODORE'S\npg\nINT\nTHEODORE'S\nburrito\nD\nINT\nTHEODORE'S\nINT\nTHEODORE'S\nINT\nTHEODORE'S\npg\nSEXYKITTEN\nSexyKitten\nSEXYKITTEN\nBigGuy\nSEXYKITTEN\nstudmuffin\npg\nSEXYKITTEN\nsexykitten\nSEXYKITTEN\nMmm\nUm\nSEXYKITTEN\nSEXYKITTEN\nSEXYKITTEN\npg\nSEXYKITTEN\nUm\nSEXYKITTEN\nSEXYKITTEN\nSEXYKITTEN\nSEXYKITTEN\nSEXYKITTEN\nAHHHHHHHHHHHH\nSEXYKITTEN\npg\nSEXYKITTEN\nINT\nelectronica\nINT\nTHEODORE'S\npg\nMmm\nUh\num\npg\nwhat'll\nSamantha\nSAMANTHA\npg\nSAMANTHA\nSamantha\nSAMANTHA\nSAMANTHA\nhundredths\nSAMANTHA\nSAMANTHA\npg\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSAMANTHA\nTheodore's\nSAMANTHA\nUm\npg\nD\nSAMANTHA\nemails\nemails\nSamantha\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSAMANTHA\npg\nINT\nTHEODORE'S\nSAMANTHA\nUm\nSAMANTHA\nSAMANTHA\nSAMANTHA\nTheodore's\nSAMANTHA\nSAMANTHA\npg\nSAMANTHA\nSamantha\nTheodore's\nSAMANTHA\nTheodore's\nSAMANTHA\nSAMANTHA\npg\nSAMANTHA\nSAMANTHA\nSAMANTHA\nINT\nTHEODORE'S\nTheo\nUh\num\npg\nINT\nTHEODORE'S\npg\nINT\nTHEODORE'S\nTheodore's\nSAMANTHA\nSAMANTHA\npg\nUh\nSAMANTHA\nSAMANTHA\nsoooo\nTheodore's\nfuckface\nfuckhead\nshitface\nfuckhead\nSAMANTHA\npg\nfuckhe
ad\nfarts\nSAMANTHA\nLewman\nSAMANTHA\nLewman\nSAMANTHA\ngoddaughter's\npg\nSAMANTHA\nSAMANTHA\nlaude\nSAMANTHA\nSAMANTHA\nemails\nSAMANTHA\nSAMANTHA\nSamantha\npg\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSAMANTHA\nSamantha\npg\nSAMANTHA\nSAMANTHA\nSamantha\nsnickers\nINT\npg\nTheo's\nINT\nINT\npg\nTheodore's\nSAMANTHA\nSAMANTHA\nemails\npg" +"using UnityEngine;\nusing System.Collections;\n\n[RequireComponent (typeof (ThirdPersonController))]\npublic class AnimationController : MonoBehaviour\n{\n\tenum CharacterState\n\t{\n\t\tNormal,\n\t\tJumping,\n\t\tFalling,\n\t\tLanding\n\t}\n\t\n\t\n\tpublic Animation target;\n\t\t// The animation component being controlled\n\tnew public Rigidbody rigidbody;\n\t\t// The rigidbody movement is read from\n\tpublic Transform root, spine, hub;\n\t\t// The animated transforms used for lower body rotation\n\tpublic float\n\t\twalkSpeed = 0.2f,\n\t\trunSpeed = 1.0f,\n\t\t\t// Walk and run speed dictate at which rigidbody velocity, the animation should blend\n\t\trotationSpeed = 6.0f,\n\t\t\t// The speed at which the lower body should rotate\n\t\tshuffleSpeed = 7.0f,\n\t\t\t// The speed at which the character shuffles his feet back into place after an on-the-spot rotation\n\t\trunningLandingFactor = 0.2f;\n\t\t\t// Reduces the duration of the landing animation when the rigidbody has hoizontal movement\n\t\n\t\n\tprivate ThirdPersonController controller;\n\tprivate CharacterState state = CharacterState.Falling;\n\tprivate bool canLand = true;\n\tprivate float currentRotation;\n\tprivate Vector3 lastRootForward;\n\t\n\t\n\tprivate Vector3 HorizontalMovement\n\t{\n\t\tget\n\t\t{\n\t\t\treturn new Vector3 (rigidbody.velocity.x, 0.0f, rigidbody.velocity.z);\n\t\t}\n\t}\n\t\n\t\n\tvoid Reset ()\n\t// Run setup on component attach, so it is visually more clear which references are used\n\t{\n\t\tSetup ();\n\t}\n\t\n\t\n\tvoid Setup ()\n\t// If target or rigidbody are not set, try using fallbacks\n\t{\n\t\tif (target" +" 'Import',\n 
'start_import' => 'Start Import',\n 'import_running' => 'Import running...',\n 'import_file' => 'File for Import',\n\n 'import_help' => 'You can import your existing browser bookmarks here. Usually, bookmarks are exported into an .html file by your browser. Select the file here and start the import.
Depending on the number of bookmarks, this process may take some time.',\n\n    'import_networkerror' => 'Something went wrong while trying to import the bookmarks. Please check your browser console for details or consult the application logs.',\n    'import_error' => 'Something went wrong while trying to import the bookmarks. Please consult the application logs.',\n    'import_empty' => 'Could not import any bookmarks. Either the uploaded file is corrupt or empty.',\n    'import_successfully' => ':imported links imported successfully, :skipped skipped.',\n];" +"# Projects\n\nAs mentioned in the previous section, a project in Livekeys consists of one or more Qml files or components, out of\nwhich one is the main file or the one that's being executed. Livekeys treats these as 2 types of projects: file\nbased and folder based. The project directory can be accessed within qml, by using the\n`project.dir()` expression:\n\n```\nimport QtQuick 2.3\n\nText{\n    text: 'Project path :' + project.dir()\n}\n```\n\nThis helps keep file access relative to the project folder. For projects that have a single file, the\n`project.dir()` expression returns the path to the directory the file is located in.\n\nAs a rule within project based files, qml files that start with a capital letter are used as components within\nthat project folder, since they define new types, and files that start with a lowercase letter are mainly used for\nexecution like sample files, tests or main.qml files:\n\n```\n|- MyComponent.qml\n|- samples/myComponentUsage.qml\n|- tests/myComponentTest.qml\n|- main.qml\n```\n\n## Active Files\n\nWhen opening a project folder, Livekeys expects a similar structure to the above to be present.
It will look to set\nthe active file automatically, by first looking for the main.qml, then looking" +"---\ntitle: \"Data Objects and Data Sources: Manipulation\"\nms.date: \"11/04/2016\"\nhelpviewer_keywords: [\"data objects [MFC], manipulating\", \"data sources [MFC], data operations\", \"data sources [MFC], inserting data\", \"Clipboard [MFC], determining available formats\", \"OLE [MFC], data objects\", \"Clipboard [MFC], passing format information\", \"data sources [MFC], determining available formats\", \"delayed rendering [MFC]\", \"OLE [MFC], data sources\"]\nms.assetid: f7f27e77-bb5d-4131-b819-d71bf929ebaf\n---\n# Data Objects and Data Sources: Manipulation\n\nAfter a data object or data source has been created, you can perform a number of common operations on the data, such as inserting and removing data, enumerating the formats the data is in, and more. This article describes the techniques necessary to complete the most common operations. Topics include:\n\n- [Inserting data into a data source](#_core_inserting_data_into_a_data_source)\n\n- [Determining the formats available in a data object](#_core_determining_the_formats_available_in_a_data_object)\n\n- [Retrieving data from a data object](#_core_retrieving_data_from_a_data_object)\n\n## Inserting Data into a Data Source\n\nHow data is inserted into a data source depends on whether the data is supplied immediately or on demand, and in which medium it is supplied. The possibilities are as follows.\n\n### Supplying Data Immediately (Immediate Rendering)\n\n- Call `COleDataSource::CacheGlobalData` repeatedly for every Clipboard format in which you are supplying data. 
Pass the Clipboard format to" +"param(\n\t[Parameter(Mandatory=$true)]\n\t[ValidateSet(\"GTA\",\"TAfGT\")]\n\t[string] $flavor\n)\n\n$solutionFile = \".\\GoogleTestAdapter\\GoogleTestAdapter.sln\"\n$gta_guids = @(\n \"E6276CAD-E4C3-4B25-876A-65B265EBFF1A\",\n \"17F4B73F-E4D3-4E40-98FC-788B1D0F8225\",\n \"87F26371-0005-4301-9C49-A6DF4F06B81C\",\n \"4735D8CC-FA30-432D-854C-2984A7DA5DD2\"\n)\n$tafgt_guids = @(\n\t\"55294B5F-A075-43F2-B0E9-2B11925E8B91\",\n\t\"9041BDED-FA1B-4C17-B7EA-7B750C470C23\",\n\t\"B3AEAD11-8EA3-4AB0-9DB0-643BFAAEB9B2\",\n\t\"483FE0C7-4E8D-4591-BE45-EAC6B2EA5F4F\"\n)\n\n$guids = if ($flavor -eq \"GTA\") { $tafgt_guids } else { $gta_guids }\n$guids_regex = [string]::Join('|', $guids)\n\n$is_processing_project = $false\n(Get-Content $solutionFile) | ForEach-Object {\n if ($is_processing_project) {\n if ($_ -like \"EndProject\") {\n $is_processing_project = $false\n }\n } elseif ($_ -match $guids_regex) {\n if ($_ -like \"Project*\") {\n $is_processing_project = $true\n }\n } else {\n # Add the line to the new sln file if it isn't related to one of our projects\n $_\n }\n} | Set-Content $solutionFile" +"* Broadcom BCM7xxx Ethernet Systemport Controller (SYSTEMPORT)\n\nRequired properties:\n- compatible: should be one of:\n\t \"brcm,systemport-v1.00\"\n\t \"brcm,systemportlite-v1.00\" or\n\t \"brcm,systemport\"\n- reg: address and length of the register set for the device.\n- interrupts: interrupts for the device, first cell must be for the rx\n interrupts, and the second cell should be for the transmit queues. 
An\n optional third interrupt cell for Wake-on-LAN can be specified\n- local-mac-address: Ethernet MAC address (48 bits) of this adapter\n- phy-mode: Should be a string describing the PHY interface to the\n Ethernet switch/PHY, see Documentation/devicetree/bindings/net/ethernet.txt\n- fixed-link: see Documentation/devicetree/bindings/net/fixed-link.txt for\n the property specific details\n\nOptional properties:\n- systemport,num-tier2-arb: number of tier 2 arbiters, an integer\n- systemport,num-tier1-arb: number of tier 1 arbiters, an integer\n- systemport,num-txq: number of HW transmit queues, an integer\n- systemport,num-rxq: number of HW receive queues, an integer\n\nExample:\nethernet@f04a0000 {\n\tcompatible = \"brcm,systemport-v1.00\";\n\treg = <0xf04a0000 0x4650>;\n\tlocal-mac-address = [ 00 11 22 33 44 55 ];\n\tfixed-link = <0 1 1000 0 0>;\n\tphy-mode = \"gmii\";\n\tinterrupts = <0x0 0x16 0x0>,\n\t\t<0x0 0x17 0x0>;\n};" +"# coding: utf-8\n\n# https://forum.omz-software.com/topic/3039/share-draw-text-in-a-circle-not-earth-shattering/2\n\nimport ui\n# Pythonista Forum - @Phuket2\n# for @ccc , should be pep8 ok and pyflakes :)\n\n# No break through here. just expanded on the Pythonista help\n# code for ui.ImageContext\n\n\n# draw text in a circle, and return a ui.image\ndef text_in_circle(r,\n text,\n text_color = 'white',\n circle_color = 'teal',\n circle_alpha = 1.0,\n font_name = 'Arial Rounded MT Bold',\n inset_percent = 0):\n\n\t'''\n\ttext_in_circle - * denotes a param\n\t==============\n\t*r-ui.Rect or tuple (0, 0, 0, 0) - the bounding rect for the circle.\n\t\n\t*text-text to draw in the circle\n\t\n\t*text_color-color of the text drawn inside the circle\n\t\n\t*circle_color-color of the circle\n\t\n\t*circle_alpha-alpha setting applied to circle color. Note, did this\n\tfor a reason. 
easier to use string names for colors!\n\t\n\t*font_name-the font used to render the *text\n\t\n\t*inset_percent-reduces *r by a percentage for l,t,w,h for possible\n\tbetter placement of the text inside the circle. a.k.a margin\n\t\n\tRETURNS - a rendered uiImage\n\t\n\t'''\n\t\n\t# this inner function does not need to be here, was just to keep it\n\t# all together\n\tdef get_max_fontsize(r, text, font_name, inset_rect = ui.Rect()):\n\t\tr1 = ui.Rect(*r).inset(*inset_rect)\n\t\tfor i in xrange(5, 1000):\n\t\t\tw, h = ui.measure_string(text, max_width=0,\n\t\t\tfont=(font_name, i)," +"final: prev:\nlet\n callCabal2Nix = compiler-nix-name: name: src: final.evalPackages.stdenv.mkDerivation {\n name = \"${name}-package.nix\";\n inherit src;\n nativeBuildInputs = [\n # It is not safe to check the nix-tools materialization here\n # as we would need to run this code to do so leading to\n # infinite recursion (so using nix-tools-unchecked).\n final.evalPackages.haskell-nix.nix-tools-unchecked.${compiler-nix-name}\n ];\n phases = [ \"unpackPhase\" \"buildPhase\" ];\n\n LOCALE_ARCHIVE = final.lib.optionalString (final.stdenv.hostPlatform.libc == \"glibc\") \"${final.glibcLocales}/lib/locale/locale-archive\";\n LANG = \"en_US.UTF-8\";\n LC_ALL = \"en_US.UTF-8\";\n\n buildPhase = ''\n cabal-to-nix *.cabal > $out\n '';\n };\n\n # Combines multiple derivations into one to make them\n # easier to materialize.\n # Using `cp -Lr` here follows the symlinks and prevents\n # `access to path is forbidden in restricted mode`\n # errors on hydra when the materialized files are not present.\n combineFiles = name: ext: files:\n let links = final.linkFarm name\n (final.lib.mapAttrsToList (name: path: {\n name = name + ext;\n inherit path;\n }) files);\n in final.evalPackages.runCommand \"${name}${ext}\" {} ''\n cp -Lr ${links} $out\n chmod -R +w $out\n '';\n\n # Combine the all the boot package nix files for a given ghc\n # into a single derivation and materialize it.\n 
combineAndMaterialize = ghcName: bootPackages:\n let\n # Not all the boot packages for ghc 8.8 and above can be\n # processed" +"/* SPDX-License-Identifier: GPL-2.0 */\n/*\n * Copyright (C) Microsoft Corporation\n */\n\n#ifndef __TPM2_FTPM_TEE_H__\n#define __TPM2_FTPM_TEE_H__\n\n/* This UUID is generated with uuidgen */\n#define TA_FTPM_UUID { 0xBC50D971, 0xD4C9, 0x42C4, \\\n\t{0x82, 0xCB, 0x34, 0x3F, 0xB7, 0xF3, 0x78, 0x96} }\n\n/* The TAFs ID implemented in this TA */\n#define FTPM_OPTEE_TA_SUBMIT_COMMAND (0)\n#define FTPM_OPTEE_TA_EMULATE_PPI (1)\n\n/* max. buffer size supported by fTPM */\n#define MAX_COMMAND_SIZE 4096\n#define MAX_RESPONSE_SIZE 4096\n\n/**\n * struct ftpm_tee_private - fTPM's private context\n * @tee_dev: struct udevice for TEE.\n * @session: fTPM TA session identifier.\n * @is_open: Indicates whether the driver is already opened by client or not.\n * @shm: Memory pool shared with fTPM TA in TEE.\n */\nstruct ftpm_tee_private {\n\tstruct udevice *tee_dev;\n\tu32 session;\n\tint is_open;\n\tstruct tee_shm *shm;\n};\n\n#endif /* __TPM2_FTPM_TEE_H__ */" +"# frozen_string_literal: true\n\nclass Pry\n class Command\n class Ls < Pry::ClassCommand\n module MethodsHelper\n include Pry::Command::Ls::JRubyHacks\n\n private\n\n # Get all the methods that we'll want to output.\n def all_methods(instance_methods = false)\n methods = if instance_methods || @instance_methods_switch\n Pry::Method.all_from_class(@interrogatee)\n else\n Pry::Method.all_from_obj(@interrogatee)\n end\n\n if Pry::Helpers::Platform.jruby? 
&& !@jruby_switch\n methods = trim_jruby_aliases(methods)\n end\n\n methods.select { |method| @ppp_switch || method.visibility == :public }\n end\n\n def resolution_order\n if @instance_methods_switch\n Pry::Method.instance_resolution_order(@interrogatee)\n else\n Pry::Method.resolution_order(@interrogatee)\n end\n end\n\n def format(methods)\n methods.sort_by(&:name).map do |method|\n if method.name == 'method_missing'\n color(:method_missing, 'method_missing')\n elsif method.visibility == :private\n color(:private_method, method.name)\n elsif method.visibility == :protected\n color(:protected_method, method.name)\n else\n color(:public_method, method.name)\n end\n end\n end\n end\n end\n end\nend" +"class Service::MqttPub < Service\n self.title = 'MQTT publish'\n\n string :broker, :port, :topic, :clientid, :user\n password :pass\n boolean :retain\n\n require 'mqtt'\n\n def receive_push\n\n # Optional - use m2m.io public broker if not specified\n broker = data['broker'].to_s\n if broker == ''\n broker = 'q.m2m.io'\n end\n\n # Optional - use standard MQTT port if not specified\n port = data['port'].to_i\n if data['port'].to_s == ''\n port = 1883\n end\n\n if data['topic'].to_s == ''\n raise_config_error \"Invalid topic. 
Try github// .\"\n end\n\n # Optional - generate random epoch for ID if not specified\n clientid = data['clientid'].to_s\n if clientid == ''\n # Random ID doesn't make sense, but use prefix like MQTT::generate_client_id\n clientid = 'github_' + Time.now.to_i.to_s\n end\n\n # Optional, specify nil if not specified (per http://rubydoc.info/gems/mqtt/MQTT/Client)\n user = data['user'].to_s\n if user == ''\n user = nil\n end\n\n # Optional, specify nil if not specified\n pass = data['pass'].to_s\n if pass == ''\n pass = nil\n end\n\n # Handle specifying a username or a password, but not both\n if user != nil and pass == nil\n raise_config_error \"You specified a username without a password.\"\n end\n\n if pass != nil and user == nil\n raise_config_error \"You specified a password without a username.\"\n end\n\n begin\n # Connect to the broker, publish" +"#define MEM_CHUNK 10000\n#define TRY_KEYS 50\n\n// Number of trailers == number of sectors\n// Mifare Classic 1k 16x64b = 16\n#define NR_TRAILERS_1k (16)\n// Mifare Classic Mini\n#define NR_TRAILERS_MINI (5)\n// Mifare Classic 4k 32x64b + 8*256b = 40\n#define NR_TRAILERS_4k (40)\n// Mifare Classic 2k 32x64b\n#define NR_TRAILERS_2k (32)\n\n// Number of blocks\n// Mifare Classic 1k\n#define NR_BLOCKS_1k 0x3f\n// Mifare Classic Mini\n#define NR_BLOCKS_MINI 0x13\n// Mifare Classic 4k\n#define NR_BLOCKS_4k 0xff\n// Mifare Classic 2k\n#define NR_BLOCKS_2k 0x7f\n\n#define MAX_FRAME_LEN 264\n\n// Used for counting nonce distances, explore [nd-value, nd+value]\n#define DEFAULT_TOLERANCE 20\n\n// Default number of distance probes\n#define DEFAULT_DIST_NR 15\n\n// Default number of probes for a key recovery for one sector\n#define DEFAULT_PROBES_NR 150\n\n// Number of sets with 32b keys\n#define DEFAULT_SETS_NR 5\n\n#define odd_parity(i) (( (i) ^ (i)>>1 ^ (i)>>2 ^ (i)>>3 ^ (i)>>4 ^ (i)>>5 ^ (i)>>6 ^ (i)>>7 ^ 1) & 0x01)\n\ntypedef struct {\n uint8_t KeyA[6];\n uint8_t KeyB[6];\n bool foundKeyA;\n bool foundKeyB;\n uint8_t trailer; 
// Value of a trailer block\n} sector;\n\ntypedef struct {\n uint32_t *distances;\n uint32_t median;\n uint32_t num_distances;\n uint32_t tolerance;\n uint8_t parity[3]; // used for 3 bits of parity information\n} denonce; // Revealed" +"#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nA pure-Python module for identifying and examining RAR files developed without\nany exposure to the original unrar code. (Just format docs from wotsit.org)\n\nIt was, however, influenced by the zipfile module in the Python standard\nlibrary as, having already decided to match the zipfile.ZipFile API as closely\nas feasibly possible, I didn't see a point to doing extra work to come up with\nnew ways of laying out my code for no good reason.\n\n@todo: Determine how rarfile (http://rarfile.berlios.de/) compares to this in\nvarious target metrics. If it is superior or close enough on all fronts,\npatch it as necessary and plan a migration path. Otherwise, do the following:\n - Complete the parsing of the RAR metadata.\n (eg. Get data from archive header, check CRCs, read cleartext comments, etc.)\n - Optimize further and write a test suite.\n - Double-check that ZipFile/ZipInfo API compatibility has been maintained\n wherever feasible.\n - Support extraction of files stored with no compression.\n - Look into supporting split and password-protected RARs.\n - Some password-protected RAR files use blocks with types 0x30, 0x60, and 0xAD\n according to this code. 
Figure out whether it's a bug or whether they're really" +"#!/usr/bin/env bash\n\n# Creates DeepCpG data files.\n\n\n# Source dependencies.\nsource \"./lib.sh\"\n\n# Set to 1 for testing and 0 for real run.\ntest_mode=1\n# Directory with CpG profiles.\ncpg_dir=\"../data/cpg\"\n# Directory with DNA sequences.\ndna_dir=\"../data/dna/mm10\"\n\n# Create data files.\ncmd=\"dcpg_data.py\n    --cpg_profiles $cpg_dir/*.tsv\n    --dna_files $dna_dir\n    --out_dir $data_dir\n    --dna_wlen 1001\n    --cpg_wlen 50\n    \"\nif [[ $test_mode -eq 1 ]]; then\n    cmd=\"$cmd --nb_sample 1000\"\nfi\nrun $cmd\n\n# Compute statistics, e.g. the total number of CpG sites and the mean\n# methylation rate of each cell. Change the input `./data/*` to\n# `./data/c{1,3,5}*.h5` to compute statistics for a subset of the data, which is\n# useful for deciding how to split the data into training, validation, and test\n# set.\ncmd=\"dcpg_data_stats.py $data_dir/* | tee $data_dir.txt\"\nrun $cmd" +"# S6:2017 Security Misconfiguration\n\n## Attack Vectors\n\nUnused pages are replaced with unlinked triggers, unprotected files and directories are changed to public resources, like public buckets. Attackers will try to identify misconfigured functions with long timeout or low concurrency limit in order to cause a Denial of Service (DoS). Additionally, functions which contain unprotected secrets like keys and tokens in the code or the environment could eventually result in sensitive information leakage.\n\n## Security Weakness\n\nServerless reduces the need to patch the environment, since we do not control the infrastructure. However, in many cases the biggest weakness is human error.
Secrets could be [accidentally uploaded to the github repo](https://www.forbes.com/sites/runasandvik/2014/01/14/attackers-scrape-github-for-cloud-service-credentials-hijack-account-to-mine-virtual-currency/), put on a public bucket, or even hardcoded in the function.\n\nAdditionally, functions with long timeout configuration give an attacker the opportunity to make their exploit last longer or just cause an increased charge for the function execution.\n\nMoreover, functions with a low concurrency limit could lead to a DoS attack, while functions with a high concurrency limit could result in a Denial of Wallet (see the [Other Risks](0xab-other-risks.md) section).\n\n## Impact\n\nMisconfiguration could lead to sensitive information leakage, money loss, DoS or in severe cases, unauthorized access to cloud resources." +"package xyz.nulldev.androidcompat.service\n\nimport android.app.Service\nimport android.content.Context\nimport android.content.Intent\nimport mu.KotlinLogging\nimport java.util.concurrent.ConcurrentHashMap\nimport kotlin.concurrent.thread\n\n/**\n * Service emulation class\n *\n * TODO Possibly handle starting services via bindService\n */\n\nclass ServiceSupport {\n val runningServices = ConcurrentHashMap()\n\n private val logger = KotlinLogging.logger {}\n\n fun startService(context: Context, intent: Intent) {\n val name = intentToClassName(intent)\n\n logger.debug { \"Starting service: $name\" }\n\n val service = serviceInstanceFromClass(name)\n\n runningServices.put(name, service)\n\n //Setup service\n thread {\n callOnCreate(service)\n //TODO Handle more complex cases\n service.onStartCommand(intent, 0, 0)\n }\n }\n\n fun stopService(context: Context, intent: Intent) {\n val name = intentToClassName(intent)\n stopService(name)\n }\n\n fun stopService(name: String) {\n logger.debug { \"Stopping service: $name\" }\n val service = runningServices.remove(name)\n if(service == null) {\n logger.warn { \"An attempt was made to stop a service that is not 
running: $name\" }\n } else {\n thread {\n service.onDestroy()\n }\n }\n }\n\n fun stopSelf(service: Service) {\n stopService(service.javaClass.name)\n }\n\n fun callOnCreate(service: Service) = service.onCreate()\n\n fun intentToClassName(intent: Intent) = intent.component.className!!\n\n fun serviceInstanceFromClass(className: String): Service {\n val clazzObj = Class.forName(className)\n return clazzObj.newInstance() as? Service\n ?: throw IllegalArgumentException(\"$className is not a Service!\")\n }\n}" +"function [pos, tri] = mesh_sphere(n, method)\n\n% MESH_SPHERE creates spherical mesh, with approximately nvertices vertices\n%\n% Use as\n% [pos, tri] = mesh_sphere(numvertices, method)\n%\n% The input parameter 'n' specifies the (approximate) number of vertices.\n% Once log4((n-2)/10) is an integer, the mesh will be based on an icosahedron.\n% Once log4((n-2)/4) is an integer, the mesh will be based on a refined octahedron.\n% Once log4((n-2)/2) is an integer, the mesh will be based on a refined tetrahedron.\n% Otherwise, an msphere will be used. If n is empty, or undefined, a 12 vertex\n% icosahedron will be returned.\n%\n% The input parameter 'method' defines which function to use when an refined\n% icosahedron, octahedron or tetrahedron is not possible, and can be 'msphere'\n% (default), or 'ksphere'.\n%\n% See also MESH_TETRAHEDRON, MESH_OCTAHEDRON, MESH_ICOSAHEDRON\n\n% Copyright (C) 2002, Robert Oostenveld\n% Copyright (C) 2019, Robert Oostenveld and Jan-Mathijs Schoffelen\n%\n% This file is part of FieldTrip, see http://www.fieldtriptoolbox.org\n% for the documentation and details.\n%\n% FieldTrip is free software: you can redistribute it and/or modify\n% it under the terms of the GNU General Public License as published by\n% the Free Software" +"DocType is the basic building block of an application and encompasses all the three elements i.e. model, view and controller. 
It represents a:\n\n- Table in the database\n- Form in the application\n- Controller (class) to execute business logic\n\n#### Single Type\n\nDocTypes can be of \"Single\" type where they do not represent a table, and only one instance is maintained. This can be used where the DocType is required only for its view features or to store some configurations in one place.\n\n#### Child Tables\n\nDocTypes can be child tables of other DocTypes. In such cases, they must define `parent`, `parenttype` and `parentfield` properties to uniquely identify their placement.\n\nIn the parent DocType, the position of a child in the field sequence is defined by the `Table` field type." +"package define\n\nconst (\n\t// HealthCheckHealthy describes a healthy container\n\tHealthCheckHealthy string = \"healthy\"\n\t// HealthCheckUnhealthy describes an unhealthy container\n\tHealthCheckUnhealthy string = \"unhealthy\"\n\t// HealthCheckStarting describes the time between when the container starts\n\t// and the start-period (time allowed for the container to start and application\n\t// to be running) expires.\n\tHealthCheckStarting string = \"starting\"\n)\n\n// HealthCheckStatus represents the current state of a container\ntype HealthCheckStatus int\n\nconst (\n\t// HealthCheckSuccess means the health worked\n\tHealthCheckSuccess HealthCheckStatus = iota\n\t// HealthCheckFailure means the health ran and failed\n\tHealthCheckFailure HealthCheckStatus = iota\n\t// HealthCheckContainerStopped means the health check cannot\n\t// be run because the container is stopped\n\tHealthCheckContainerStopped HealthCheckStatus = iota\n\t// HealthCheckContainerNotFound means the container could\n\t// not be found in local store\n\tHealthCheckContainerNotFound HealthCheckStatus = iota\n\t// HealthCheckNotDefined means the container has no health\n\t// check defined in it\n\tHealthCheckNotDefined HealthCheckStatus = iota\n\t// HealthCheckInternalError means something failed obtaining or running\n\t// a 
given health check\n\tHealthCheckInternalError HealthCheckStatus = iota\n\t// HealthCheckDefined means the healthcheck was found on the container\n\tHealthCheckDefined HealthCheckStatus = iota\n)" +"//! Standard return type for invoking operations, returning success or an error\n//! code.\n//!\n//! - Author: Philip Levis \n//! - Date: Dec 22, 2016\n\n/// Standard return errors in Tock.\n#[derive(Clone, Copy, Debug, PartialEq)]\npub enum ReturnCode {\n /// Success value must be positive\n SuccessWithValue { value: usize },\n /// Operation completed successfully\n SUCCESS,\n /// Generic failure condition\n FAIL,\n /// Underlying system is busy; retry\n EBUSY,\n /// The state requested is already set\n EALREADY,\n /// The component is powered down\n EOFF,\n /// Reservation required before use\n ERESERVE,\n /// An invalid parameter was passed\n EINVAL,\n /// Parameter passed was too large\n ESIZE,\n /// Operation canceled by a call\n ECANCEL,\n /// Memory required not available\n ENOMEM,\n /// Operation or command is unsupported\n ENOSUPPORT,\n /// Device does not exist\n ENODEVICE,\n /// Device is not physically installed\n EUNINSTALLED,\n /// Packet transmission not acknowledged\n ENOACK,\n}\n\nimpl From for isize {\n fn from(original: ReturnCode) -> isize {\n match original {\n ReturnCode::SuccessWithValue { value } => value as isize,\n ReturnCode::SUCCESS => 0,\n ReturnCode::FAIL => -1,\n ReturnCode::EBUSY => -2,\n ReturnCode::EALREADY => -3,\n ReturnCode::EOFF => -4,\n ReturnCode::ERESERVE => -5,\n ReturnCode::EINVAL => -6,\n ReturnCode::ESIZE => -7,\n ReturnCode::ECANCEL => -8,\n ReturnCode::ENOMEM => -9,\n ReturnCode::ENOSUPPORT => -10,\n ReturnCode::ENODEVICE" +"#!/usr/bin/env bash\n\n#!/bin/bash\n\nnchains=20 # number of chains to create\nnentries=0 # number of entries to add to each chain\n\nfa1=$(factom-cli importaddress Fs3E9gV6DXsYzf7Fqx1fVBQPQXV695eP3k5XbmHEZVRLkMdD9qCK)\n\nec1=$(factom-cli importaddress 
Es3LB2YW9bpdWmMnNQYb31kyPzqnecsNqmg5W4K7FKp4UP6omRTa)\n\necho \"Buying\" 1000 $fa1 $ec1\nfactom-cli buyec $fa1 $ec1 100\nsleep 5s\n\naddentries() {\n # create a random datafile\n\tdatalen=$(shuf -i 100-9900 -n 1)\n\tdatafile=$(mktemp)\n\tbase64 /dev/urandom | head -c $datalen > $datafile\n\n\techo \"Entry Length \" $datalen \" bytes, file name: \" $datafile\n\n\tlet y=$(shuf -i 30-120 -n 1)\n\techo \"sleep\" $y \" seconds before writing entries\"\n\tsleep $y\n\tfor ((i=0; i output_List] := Module[\n {inputLength, inputPermutations, heldOutput},\n inputLength = Length @ input[[1]];\n\n inputPermutations = Permutations @ input[[1]];\n heldOutput = Thread @ Hold @ output;\n\n With[{right = heldOutput, condition = input[[2]]}, (* condition is already held before it's passed here *)\n # /; condition :> right & /@ inputPermutations] /. Hold[expr_] :> expr\n] \n\n(* Now, if there are new vertices that need to be created, we will disassemble the Module remembering which variables\n it applies to, and then reassemble it for the output. *)\n\nallLeftHandSidePermutations[input_Condition :> output_Module] := Module[\n {ruleInputOriginal = input[[1]],\n ruleCondition = heldPart[input, 2],\n heldModule = mapHold[output, {0, 1}],\n moduleInputContents},\n moduleInputContents = heldModule[[1, 2]];\n With[{ruleInputFinal = #[[1]],\n moduleArguments = heldModule[[1, 1]],\n moduleOutputContents = (Hold /@ #)[[2]]},\n ruleInputFinal :> Module[moduleArguments, moduleOutputContents]\n ] & /@ allLeftHandSidePermutations[\n ruleInputOriginal /; ruleCondition" +"id: dsq-747532430\ndate: 2010-06-09T03:31:09.0000000-07:00\nname: Johan Driessen\navatar: https://disqus.com/api/users/avatars/nahojd.jpg\nmessage: \"

I couldn't get the upgrader tool to work at all. Or rather, it ran, but it totally destroyed my blog in the process. I got (at least) 3 serious errors:
1. It added a <httpErrors> section in web.config under <system.webServer>, which got me the error message \\\"Config Error: This configuration section cannot be used at this path.\\\"
2. It removed my connectionstring. Yep, that's right, totally cleared out the connectionStrings section of web.config
3. Once I commented out the <httpErrors> section and put my connection string back, I got the \\\"upgrading\\\" page, but after that i couldn't log in. Not with my admin account, and not with OpenID.
So by then I just gave up on upgrading... If it helps, I'm currently running 2.1.2.2 on Win2K8 R2.
Is there some manual way to do the upgrade?

\"" +"# Description\r\n\r\nThis tool takes a CSV file with basic metadata and expands it into source file(s) for use in Delphi based on pre-defined templates.\r\n\r\nUsing single and multi-file templates you can make almost any unit or form. The included example generates a *mORMot* `TSQLRecord` descendant as well as a CDS-based form for editing a single record/object.\r\n\r\nIt is not intended as a full-fledged RAD tool but rather to generate the bulk of repetitive code so you can copy it to your project and edit from there on.\r\n\r\n# Forum Thread\r\n\r\nSee http://synopse.info/forum/viewtopic.php?id=1911\r\n\r\n# Usage\r\n\r\n Create CSV file \r\n\r\ne.g. `SampleObj.csv`:\r\n\r\n Code,RawUTF8,Edit,30\r\n Desc,RawUTF8,Edit,512\r\n ItemType,RawUTF8,ComboBox,30\r\n Cost,Currency,Edit\r\n LastCostD,TDateTime,DateEdit\r\n VatCat,Integer,RadioGroup\r\n Active,Boolean,CheckBox\r\n\r\n# Format\r\n\r\n , DataType, Control[, Size]\r\n\r\nSave with the class name as the `FileName`, e.g. `SampleObj.csv` would create classes such as:\r\n\r\n DataSampleObj.pas (class TSQLSampleObj)\r\n SampleObjFormU.pas (Class TSampleObjForm)\r\n SampleObjFormU.dfm\r\n\r\n\r\nWhen creating a template, keep in mind the following: there are a few magic-cookies that get replaced with your text, and some tags.\r\n\r\nSome magic-cookies:\r\n\r\n* MyObj = ClassName, e.g. SampleObj (Determined by filename)\r\n* MyName = property name e.g. BirthDate\r\n* MyType = property type, e.g. `RawUTF8`\r\n* MyField = property CDS Field type `TStringField`\r\n* MyFieldAs = CDS field get/setter str, e.g. 
`AsString`\r\n* MyControl = If" +"title: Time Statistics on KVM Exit Reasons\nname: kvm_service_time.stp\nversion: 1.0\nauthor: William Cohen\nkeywords: _best virtualization kvm\nsubsystem: virtualization\nstatus: experimental\nexit: user-controlled\noutput: sorted-list\nscope: system-wide\ndescription: The kvm_service_time.stp script tracks the statistics about the amount of time that the processor left the guest virtual machine for each exit reason (for example fixing up a page table or handling an IO operation). When the script exits it prints out the number of times each exit reason was encountered, the total duration of time it left the guest VM, the minimum time, the average time, and the maximum time in microseconds for that exit reason. On Linux 2.6.38 and newer kernel the script can automatically determine whether it is running on Intel or AMD processors. For older kernels with a kernel.trace(\"kvm_exit\") tracepoint that does not have the $isa parameter you can explicitly state the kvm type with a \"-G kvm=intel\" or \"-G kvm=amd\" on the command line.\ntest_support: stap -l 'kernel.trace(\"kvm_entry\")' && stap -l 'kernel.trace(\"kvm_exit\")'\ntest_check: stap -p4 kvm_service_time.stp\ntest_installcheck: stap kvm_service_time.stp -c \"sleep 1\"" +"# Docker Builds for he-transformer with a _Reference-OS_\n\n## Introduction\n\nThis directory contains a basic build system for creating docker images of the _reference-OS_ on which he-transformer builds and unit tests are run. 
The purpose is to provide reference builds for _Continuous Integration_ used in developing and testing he-transformer.\n\nThe `Makefile` provides targets for:\n\n* Building the _reference-OS_ into a docker image\n* Building he-transformer and running unit tests in this cloned repo, mounted into the docker image of the _reference-OS_\n* Starting an interactive shell in the _reference-OS_ docker image, with the cloned repo available for manual builds and unit testing\n\nThe _make_ targets are designed to handle all aspects of building the _reference-OS_ docker image, running he-transformer builds and unit testing in it, and opening up a session in the docker image for interactive use. You should not need to issue any manual commands (unless you want to). In addition the `Dockerfile.he-transformer.*` files provide a description of how each _reference-OS_ environment is built, should you want to build your own server or docker image.\n\n## Prerequisites\n\nIn order to use the _make_ targets, you will need to do the following:\n\n* Have *docker* installed on your computer with" +"[en]\nCMD_MENU = Commands Menu\nCONF_MENU = Configs Menu\nSPE_MENU = Speech Menu\n\n[de]\nCMD_MENU = Men\u00fc > Befehle\nCONF_MENU = Men\u00fc > Konfiguration\nSPE_MENU = Men\u00fc > Sprechen\n\n[sr]\nCMD_MENU = Komandne\nCONF_MENU = Podesavanja\nSPE_MENU = Govorne Komande\n\n[tr]\nCMD_MENU = Komut Menu\nCONF_MENU = Config Menu\nSPE_MENU = Konusma Menusu\n\n[fr]\nCMD_MENU = Menu Commandes\nCONF_MENU = Menu Configurations\nSPE_MENU = Menu Voix/Paroles\n\n[sv]\nCMD_MENU = Kommandomeny\nCONF_MENU = Konfigurationsmeny\nSPE_MENU = Talmeny\n\n[da]\nCMD_MENU = Kommando Menu\nCONF_MENU = Konfigurations Menu\nSPE_MENU = Tale Menu\n\n[pl]\nCMD_MENU = Menu komend\nCONF_MENU = Menu konfiguracji\nSPE_MENU = Menu rozmowy\n\n[nl]\nCMD_MENU = Commandomenu\nCONF_MENU = Configuratiemenu\nSPE_MENU = Spraakmenu\n\n[es]\nCMD_MENU = Menu de Comandos\nCONF_MENU = Menu de Configuracion\nSPE_MENU = Menu de 
Voz\n\n[bp]\nCMD_MENU = Menu de Comandos\nCONF_MENU = Menu de Configs\nSPE_MENU = Menu de Vozes\n\n[cz]\nCMD_MENU = Menu prikazu\nCONF_MENU = Menu nastaveni\nSPE_MENU = Nastaveni reci\n\n[fi]\nCMD_MENU = Komentovalikko\nCONF_MENU = Saatovalikko\nSPE_MENU = Puhevalikko\n\n[bg]\nCMD_MENU = Menu s komandi\nCONF_MENU = Konfiguracionno menu\nSPE_MENU = Menu za govorene\n\n[ro]\nCMD_MENU = Menu Comenzi\nCONF_MENU = Menu Configuratie\nSPE_MENU = Menu Speech\n\n[hu]\nCMD_MENU = Parancs Men\u00fc" +"# The comprehensions here shouldn't get collapsed to single lines.\n\nGLOB_resources_legacy_txt = glob([\"resources/legacy/*.txt\"])\n\nGLOB_resources_legacy_txt2 = [\n filename_and_some_extra_to_make_this_long\n for filename_and_some_extra_to_make_this_long in GLOB_resources_legacy_txt\n]\n\nlegacy_txt_name_to_filename = {\n filename.split(\"/\")[-1][:-len(\".txt\")].lower(): filename\n for filename in GLOB_resources_legacy_txt_tuple\n}\n\nflags = [\n flag\n for flag in [\n _flag(\n name,\n type(default),\n values.pop(name, default),\n )\n for name, default in defaults.items()\n ]\n if flag != None\n]\n\nflags = [\n flag # Example comment.\n for flag in [\n _flag(\n name,\n type(default),\n values.pop(name, default),\n )\n for name, default in defaults.items()\n ] # Clarification.\n if flag != None\n] # Skip empty elements.\n\nflags = [\n # Example comment.\n flag\n # Clarification.\n for flag in [\n _flag(\n name,\n type(default),\n values.pop(name, default),\n )\n for name, default in defaults.items()\n ]\n # Skip empty elements.\n if flag != None\n]\n\nflags = [\n\n # Example comment.\n flag\n\n # Clarification.\n for flag in [\n _flag(\n name,\n type(default),\n values.pop(name, default),\n )\n for name, default in defaults.items()\n ]\n\n # Skip empty elements.\n if flag != None\n]" +"Chapter 2: Merging\n==================\n\n**Standard merge sort**\n\nMerge sorting works by breaking an array into two halves, over and over again,\nuntil the size of each half is below 
some threshold. For a threshold of 2 this\nmeans simply swapping the two items in that part of the array if they are out\nof order. For a larger threshold you could use an insertion sort or something.\nOnce we have sorted two halves of the array, we have to *merge* them together\nto arrive at the final sorted array.\n\n MergeSort(array, range)\n if (range.length == 2)\n if (array[range.start] > array[range.end]) Swap(array[range.start], array[range.end])\n else if (range.length > 2)\n mid = range.start + range.length/2\n MergeSort(array, MakeRange(range.start, mid))\n MergeSort(array, MakeRange(mid, range.end))\n\n Merge(array, MakeRange(range.start, mid), MakeRange(mid, range.end))\n\n* * *\n\n**Standard merge**\n\nThe merge operation of the merge sort algorithm takes two arrays that *are\nalready sorted* (either from swapping or insertion sorting, as mentioned above),\nand combines them into a single array containing A and B sorted together.\nThe operation is actually quite simple: just take the smaller of the two values\nat the start of A and B and add it to the final array. Once A and B are empty,\nthe final" +".\\\" $FreeBSD$\n.\\\" $MCom: portlint/portlint.1,v 1.14 2020/05/30 13:03:55 jclarke Exp $\n.\\\"\n.\\\" Copyright (c) 1997 by Jun-ichiro Hagino .\n.\\\" All Rights Reserved. 
Absolutely no warranty.\n.\\\"\n.Dd April 1, 2010\n.Dt PORTLINT 1\n.Os\n.Sh NAME\n.Nm portlint\n.Nd a verifier for port directories\n.Sh SYNOPSIS\n.Nm portlint\n.Op Fl abcghvtACNV\n.Op Fl M Ar ENV\n.Op Fl B Ar n\n.Op Ar dir\n.Sh DESCRIPTION\n.Nm\ntries to verify the content of a port directory.\nThe purpose of\n.Nm\ncan be separated into two parts:\n.Pq 1\nto let the submitters easily polish their own port directory, and\n.Pq 2\nto decrease the labor of the committers.\n.Pp\n.Nm\nuses very simple regular-expression matching for verifying\nfiles that make up a port directory.\nNote that it does NOT implement a complete parser for those files.\nBecause of this the user may see some extra warnings,\nespecially when checking complex\n.Pa Makefile Ns No s .\n.Pp\n.Sy Options\n.Bl -tag -width Fl\n.It Fl a\nPerform additional checks for extra files, such as\n.Pa scripts/*\nand\n.Pa pkg-* .\n.It Fl b\nWarn the use of\n.Pa $(VARIABLE) .\nSome of the committers prefer\n.Pa ${VARIABLE}\ninstead" +"\n \u3000\u4e94 \u4e4b\u540e \u80a1\u5e02 \u706b\u7206 \u8fde\u7eed \u5929\u5927 \u4e00\u4e3e \u7a81\u7834 \u4e94\u5927 \u6295\u8d44\u8005 \u878d\u901a \u57fa\u91d1 \u7ba1\u7406 \u516c\u53f8 \u9996\u5e2d \u5206\u6790\u5e08 \u878d\u901a \u884c\u4e1a \u666f\u6c14 \u57fa\u91d1 \u7ecf\u7406 \u51af\u5b87\u8f89 \u8868\u793a \u706b\u7206 \u57fa\u672c\u9762 \u91cd\u4e8e \u6700\u91cd\u8981 \u9009\u80a1 \n \u6295\u8d44\u8005 \u660e\u767d \u575a\u6301 \n \u3000\u51af \u8868\u793a \u57fa\u91d1 \u7ecf\u7406 \u540c\u6837 \u82e6\u82e6 \u575a\u6301 \u7ecf\u9a8c \u53bb\u5e74 \u878d\u901a \u884c\u4e1a \u7814\u7a76\u5458 \u5b9e\u5730 \u8c03\u7814 \u63a8\u8350 \u5e7f\u8239 \u56fd\u9645 \u5f53\u65f6 \u53d7\u7d2f \u94a2\u6750 \u4ef7\u683c \u8fde\u5e74 \u4e0a\u6da8 \u56fd\u5185 \u9020\u8239\u4e1a \u4e00\u76f4 \u5904\u4e8e \u4e0b\u964d \u901a\u9053 \u5e7f\u8239 \u56fd\u9645 \u4e1a\u7ee9 \u60c5\u51b5 \u7406\u60f3 \u6ca1\u6709 \u4e00\u5bb6 \u57fa\u91d1 \u770b\u597d \u6295\u8d44 \u5e7f\u8239 \u56fd\u9645 
\u9999\u6e2f \u5e02\u573a \u4e00\u76f4 \u4f4e\u8ff7 \n \u516c\u53f8 \u53ec\u5f00 \u4e13\u9898 \u8bba\u8bc1\u4f1a \u6fc0\u70c8 \u8ba8\u8bba \u516c\u53f8 \u5185\u90e8 \u5f62\u6210 \u5171\u8bc6 \u9020\u8239\u4e1a \u6b63\u5904\u4e8e \u884c\u4e1a \u666f\u6c14 \u5468\u671f \u62d0\u70b9 \u5e7f\u8239 \u56fd\u9645 \u8ba2\u5355 \u60c5\u51b5 \u672a\u6765 \u4e1a\u7ee9 \u5927\u5e45\u5ea6 \u63d0\u5347 \u878d\u901a \u884c\u4e1a \u666f\u6c14 \u57fa\u91d1 \u57fa\u91d1 \u901a\u4e7e \u51b3\u5b9a \u91cd\u4ed3 \u4ecb\u5165 \n \u5176\u540e \u76f8\u5f53 \u65f6\u95f4 \u878d\u901a \u57fa\u91d1 \u4e00\u5bb6 \u673a\u6784 \u770b\u597d \u5e7f\u8239 \u56fd\u9645 \u4ef7\u683c \u4f9d\u7136 \u4f4e\u8ff7 \u53bb\u5e74 \u60c5\u51b5 \u53d8\u5f97 \u4ef7\u683c \u6700\u4f4e \u4e0b\u8dcc \u4ef7\u683c 60% \u5de6\u53f3 \u5f53\u65f6 \u62e5\u6709 \u4e0a\u5e02\u516c\u53f8 \u80a1\u4ef7 \u666e\u904d \u63a5\u8f68 \u4f4e\u4e8e \u4e0d\u5728\u5c11\u6570 \u57fa\u91d1 \u7ecf\u7406 \u5b87\u8f89 \u5766\u627f \u611f\u5230 \u538b\u529b \u5de8\u5927 \u575a\u6301 \u4e0b\u6765 \u516c\u53f8 \u57fa\u672c\u9762 \u4e86\u89e3 \u6e05\u695a \u76f8\u4fe1 \u6700\u7ec8 \u5e02\u573a \u8ba4\u53ef \n \u3000\u51af \u5f88\u5feb \u7ed3\u675f \u96be\u71ac \u65e5\u5b50 \u9999\u6e2f \u5e02\u573a \u7ec8\u4e8e \u8ba4\u8bc6\u5230 \u5e7f\u8239 \u56fd\u9645 \u6295\u8d44 \u4ef7\u503c \u7387\u5148 \u5e26\u52a8 \u8fc5\u901f \u8d70\u9ad8 \u8fdb\u5165 \u5e7f\u8239 \u56fd\u9645 \u4ef7\u683c \u8d85\u8fc7 \u76ee\u524d \u5e7f\u8239 \u56fd\u9645 \u80a1\u6539 \u505c\u724c \u505c\u724c \u6536\u62a5 \u53bb\u5e74 \u4ef7\u683c \u76f8\u6bd4 \u51fa\u5934 \u878d\u901a \u65d7\u4e0b \u57fa\u91d1 \u5e7f\u8239 \u56fd\u9645 \u83b7\u5229 \u8d85\u8fc7 100% \n \u5e02\u573a \u5df2\u7ecf \u80fd\u591f" +"\n\n(*let buffer_intervals (intervals : (int * int) list) buf ic oc =\n intervals\n |> List.iter\n (fun (start, stop) -> \n let len = stop - start in \n if len <> 0 then \n begin\n seek_in ic start ; \n Buffer.add_channel buf ic len ; \n Buffer.output_buffer oc buf ; \n 
Buffer.clear buf;\n end\n )*)\n \n\nlet preprocess fn oc = \n let ic = open_in_bin fn in \n let lexbuf = Lexing.from_channel ic in \n let buf = Buffer.create 4096 in \n Location.init lexbuf fn;\n Lexer.init ();\n lexbuf\n |> Lexer.filter_directive_from_lexbuf \n (* Get a list of segments\n TODO: output line directive\n *)\n |> List.iter\n (fun (start, stop) -> \n let len = stop - start in \n if len <> 0 then \n begin\n seek_in ic start ; \n Buffer.add_channel buf ic len ; \n Buffer.output_buffer oc buf ; \n Buffer.clear buf;\n end\n );\n close_in ic \n\n\nlet () = \n preprocess Sys.argv.(1) stdout" +"# Interop with an existing React Redux application\n\nThis recipe will guide you through the process of integrating Easy Peasy into your existing React Redux application. It is possible to slowly migrate an existing React Redux application to Easy Peasy without doing a full rewrite. \n\nEasy Peasy outputs a standard Redux store, and allows customisation of the store via the [StoreConfig](/docs/api/store-config.html). Therefore it is possible to configure the Easy Peasy redux store to match the needs of your existing application. 
You will likely be able to move your store into Easy Peasy without the need to make any changes to your components.\n\nThis would grant you the ability to slowly and carefully refactor your existing React Redux reducers into Easy Peasy models when needed, though there is nothing preventing you from keeping the concepts (Easy Peasy models, and React Redux reducers) living side by side indefinitely.\n\n## Refactoring the creation of your store\n\nImagine you had a Redux store being configured similarly to the following.\n\n```javascript\nimport { createStore, combineReducers, applyMiddleware } from 'redux';\nimport productsReducer from './reducers/products';\nimport basketReducer from './reducers/basket';\nimport loggerMiddleware from './middleware/logger';\n\nconst rootReducer = combineReducers({\n products: productsReducer,\n basket: basketReducer\n});\n\nconst store = createStore(rootReducer, applyMiddleware(loggerMiddleware));" +"#!/bin/bash -e\n\n# maintainer (ask me any questions): rfree\n\nif [[ ! -r \"Doxyfile\" ]] ; then\n\techo \"Error, can not read the Doxyfile - make sure to run this script from top of monero project, where the Doxyfile file is located\"\n\texit 1\nfi\n\nwwwdir=\"$HOME/monero-www/\"\nif [[ ! -w \"$wwwdir\" ]] ; then\n\techo \"Error, can not write into wwwdir=$wwwdir. It should be a directory readable/connected to your webserver, or a symlink to such directory\"\n\texit 1\nfi\n\nif [[ ! 
-d \"$wwwdir/doc\" ]] ; then\n\techo \"Creating subdirs\"\n\tmkdir \"$wwwdir/doc\"\nfi\n\necho \"Generating:\"\ndoxygen Doxyfile && echo \"Backup previous version:\" && rm -rf ~/monero-www-previous && mv \"$wwwdir/doc\" ~/monero-www-previous && cp -ar doc/ \"$wwwdir/\" && echo \"Done, built and copied to public - the doxygen docs\" && echo \"size:\" && du -Dsh \"$wwwdir/\" && echo \"files:\" && find \"$wwwdir/\" | wc -l" +"# Sun\u2019s Network File System (NFS)\n\n## Homework (Measurement)\n\nIn this homework, you\u2019ll do a little bit of NFS trace analysis using real traces. The source of these traces is Ellard and Seltzer\u2019s effort [ES03]. Make sure to read the related README and download the relevant tarball from the OSTEP homework page (as usual) before starting.\n\n### Questions\n\n1. A first question for your trace analysis: using the timestamps found in the first column, determine the period of time the traces were taken from. How long is the period? What day/week/month/year was it? (does this match the hint given in the filename?) Hint: Use the tools `head -1` and `tail -1` to extract the first and last lines of the file, and do the calculation.\n\n [`strftime()`](https://www.gnu.org/software/gawk/manual/html_node/Time-Functions.html) is a `gawk` extension; it's not specified in the POSIX standard. The filename is `anon-deasna-021016-1300.txt`, that's the same date.\n\n ```\n $ awk -f q1.awk anon-deasna-021016-1300.txt\n Period: 59.98 minutes\n Start time: 16 10 2002\n ```\n\n2. Now, let\u2019s do some operation counts. How many of each type of operation occur in the trace? Sort these by frequency; which operation is most frequent? 
Does NFS live up to its reputation?\n\n ```\n $ awk '{ a[$8]++} END {for" +"China's stock markets saw heavy turnover on Monday but little change to the key indices as the impact of Deng Xiaoping's death faded and profit-taking set in, analysts said.\nInvestors expect little chance of any political problems emerging to shake the markets, at least for the next few weeks, they said.\n\"It seems to be all very stable,\" said stock analyst Alex Conroy with ING-Barings in Shanghai. \"I expect range trading for a while. There's no particular reason to be trading in a frenzy.\"\nShanghai's domestic A share index edged down 0.77 percent to 1,055.078 points on heavy volume of 1.02 billion shares while the B share index was little changed at 65.707 points, inching up just 0.21 percent against the close on Friday last week.\nIn Shenzhen, the A index rose 0.95 percent to 362.69 points and the B index lost 0.15 percent to 150.73 points.\n\"The markets were under profit-taking pressure after the past two trading days of rises,\" said a Shanghai-based A share dealer with China Guotai Securities.\n\"Institutional speculative buying of heavily-weighted stocks pushed the (Shanghai B share) index up, but most investors took a wait-and-see attitude as Deng's death has been factored in and there" +"package lemongrenade.core.database.lemongraph;\n\nimport org.json.JSONArray;\nimport org.json.JSONObject;\n/**\n * \t(Content-Type: application/json, application/x-msgpack)\n * body should be object w/ the optional kv pairs below\n * {\n * \"seed\": true, // mark any referenced nodes/edges as seeds\n * \"meta\": {}, // same structure as POST /graph, merges graph-level kv pairs\n * 'chains': [], // list of node[to edge to node]* object chains\n * 'nodes': [], // list of nodes\n * 'edges': [], // list of edges\n * }\n */\npublic class LemonGraphObject {\n private boolean isSeed;\n private JSONObject meta;\n private JSONArray nodes;\n private JSONArray edges;\n private JSONArray chains;\n\n 
public JSONObject get() {\n JSONObject graph = new JSONObject();\n graph.put(\"nodes\", nodes);\n graph.put(\"edges\", edges);\n graph.put(\"seed\", isSeed);\n graph.put(\"meta\", meta);\n graph.put(\"chains\", chains);\n return graph;\n }\n\n public JSONArray getNodes() { return nodes; }\n public void setNodes(JSONArray nodes) { this.nodes = nodes; }\n public JSONArray getEdges() { return edges; }\n public void setEdges(JSONArray edges) { this.edges = edges; }\n public JSONArray getChains() { return chains; }\n public void setChains(JSONArray chains) { this.chains = chains; }\n public JSONObject getMeta() { return meta; }\n public void setMeta(JSONObject meta) { this.meta = meta; }\n public void setSeed(boolean seed) { this.isSeed = seed;}\n public boolean getSeed() { return this.isSeed; }\n\n public LemonGraphObject(boolean isSeed, JSONObject" +"/*++\n/* NAME\n/*\trewrite_clnt 3\n/* SUMMARY\n/*\taddress rewrite service client\n/* SYNOPSIS\n/*\t#include \n/*\t#include \n/*\n/*\tVSTRING\t*rewrite_clnt(ruleset, address, result)\n/*\tconst char *ruleset;\n/*\tconst char *address;\n/*\n/*\tVSTRING\t*rewrite_clnt_internal(ruleset, address, result)\n/*\tconst char *ruleset;\n/*\tconst char *address;\n/*\tVSTRING\t*result;\n/* DESCRIPTION\n/*\tThis module implements a mail address rewriting client.\n/*\n/*\trewrite_clnt() sends a rule set name and external-form address to the\n/*\trewriting service and returns the resulting external-form address.\n/*\tIn case of communication failure the program keeps trying until the\n/*\tmail system shuts down.\n/*\n/*\trewrite_clnt_internal() performs the same functionality but takes\n/*\tinput in internal (unquoted) form, and produces output in internal\n/*\t(unquoted) form.\n/* DIAGNOSTICS\n/*\tWarnings: communication failure. 
Fatal error: mail system is down.\n/* SEE ALSO\n/*\tmail_proto(3h) low-level mail component glue.\n/* LICENSE\n/* .ad\n/* .fi\n/*\tThe Secure Mailer license must be distributed with this software.\n/* AUTHOR(S)\n/*\tWietse Venema\n/*\tIBM T.J. Watson Research\n/*\tP.O. Box 704\n/*\tYorktown Heights, NY 10598, USA\n/*--*/\n\n/* System library. */\n\n#include \n#include \n#include \n#include \n\n/* Utility library. */" +"//! A segment time achieved for a segment. It is tagged with an index. Only\n//! segment times with an index larger than 0 are considered times actually\n//! achieved by the runner, while the others are artifacts of route changes and\n//! similar algorithmic changes.\n\nuse super::output_time;\nuse livesplit_core::Time;\n\n/// type\npub type SegmentHistoryElement = (i32, Time);\n/// type\npub type NullableSegmentHistoryElement = SegmentHistoryElement;\n/// type\npub type OwnedSegmentHistoryElement = Box;\n\n/// Accesses the index of the segment history element.\n#[no_mangle]\npub extern \"C\" fn SegmentHistoryElement_index(this: &SegmentHistoryElement) -> i32 {\n this.0\n}\n\n/// Accesses the segment time of the segment history element.\n#[no_mangle]\npub extern \"C\" fn SegmentHistoryElement_time(this: &SegmentHistoryElement) -> *const Time {\n output_time(this.1)\n}" +"# encoding: utf-8\n#\n# This file contains transliteration rules for Vietnamese\n#\n# To validate this YAML file after you change it, please paste it into\n# http://yamllint.com/\n\nvi:\n i18n:\n transliterate:\n rule:\n \u00e0: \"a\"\n \u00e1: \"a\"\n \u1ea1: \"a\"\n \u1ea3: \"a\"\n \u00e3: \"a\"\n \u00e2: \"a\"\n \u1ea7: \"a\"\n \u1ea5: \"a\"\n \u1ead: \"a\"\n \u1ea9: \"a\"\n \u1eab: \"a\"\n \u0103: \"a\"\n \u1eb1: \"a\"\n \u1eaf: \"a\"\n \u1eb7: \"a\"\n \u1eb3: \"a\"\n \u1eb5: \"a\"\n \u00e8: \"e\"\n \u00e9: \"e\"\n \u1eb9: \"e\"\n \u1ebb: \"e\"\n \u1ebd: \"e\"\n \u00ea: \"e\"\n \u1ec1: \"e\"\n \u1ebf: \"e\"\n \u1ec7: \"e\"\n \u1ec3: \"e\"\n \u1ec5: \"e\"\n \u00ec: \"i\"\n \u00ed: 
\"i\"\n \u1ecb: \"i\"\n \u1ec9: \"i\"\n \u0129: \"i\"\n \u00f2: \"o\"\n \u00f3: \"o\"\n \u1ecd: \"o\"\n \u1ecf: \"o\"\n \u00f5: \"o\"\n \u00f4: \"o\"\n \u1ed3: \"o\"\n \u1ed1: \"o\"\n \u1ed9: \"o\"\n \u1ed5: \"o\"\n \u1ed7: \"o\"\n \u01a1: \"o\"\n \u1edd: \"o\"\n \u1edb: \"o\"\n \u1ee3: \"o\"\n \u1edf: \"o\"\n \u1ee1: \"o\"\n \u00f9: \"u\"\n \u00fa: \"u\"\n \u1ee5: \"u\"\n \u1ee7: \"u\"\n \u0169: \"u\"\n \u01b0: \"u\"\n \u1eeb: \"u\"\n \u1ee9: \"u\"\n \u1ef1: \"u\"\n \u1eed: \"u\"\n \u1eef: \"u\"\n \u1ef3: \"y\"\n \u00fd: \"y\"\n \u1ef5: \"y\"\n \u1ef7: \"y\"\n \u1ef9: \"y\"\n \u0111: \"d\"\n \u00c0: \"A\"\n \u00c1: \"A\"\n \u1ea0: \"A\"\n \u1ea2: \"A\"\n \u00c3: \"A\"\n \u00c2: \"A\"\n \u1ea6: \"A\"\n \u1ea4: \"A\"\n \u1eac: \"A\"\n \u1ea8: \"A\"\n \u1eaa: \"A\"\n \u0102: \"A\"\n \u1eb0: \"A\"\n \u1eae: \"A\"\n \u1eb6: \"A\"\n \u1eb2: \"A\"\n \u1eb4:" +"package pureconfig\n\nimport org.scalacheck.{Arbitrary, Gen}\nimport pureconfig.gen._\n\npackage object arbitrary {\n\n implicit val arbDuration = Arbitrary(genDuration)\n implicit val arbJavaDuration = Arbitrary(genJavaDuration)\n implicit val arbFiniteDuration = Arbitrary(genFiniteDuration)\n implicit val arbInstant = Arbitrary(genInstant)\n implicit val arbPeriod = Arbitrary(genPeriod)\n implicit val arbYear = Arbitrary(genYear)\n implicit val arbUUID = Arbitrary(Gen.uuid)\n implicit val arbPath = Arbitrary(genPath)\n implicit val arbFile = Arbitrary(genFile)\n implicit val arbPercentage = Arbitrary(genPercentage)\n implicit val arbJavaBigDecimal = Arbitrary(genJavaBigDecimal)\n implicit val arbJavaBigInteger = Arbitrary(genBigInt)\n\n implicit val arbLocalTime = Arbitrary(genLocalTime)\n implicit val arbLocalDate = Arbitrary(genLocalDate)\n implicit val arbLocalDateTime = Arbitrary(genLocalDateTime)\n implicit val arbMonthDay = Arbitrary(genMonthDay)\n implicit val arbZoneOffset = Arbitrary(genZoneOffset)\n implicit val arbOffsetDateTime = Arbitrary(genOffsetDateTime)\n implicit val arbOffsetTime = 
Arbitrary(genOffsetTime)\n implicit val arbYearMonth = Arbitrary(genYearMonth)\n implicit val arbZoneId = Arbitrary(genZoneId)\n implicit val arbZonedDateTime = Arbitrary(genZonedDateTime)\n}" +"# ADR 0026 - Post-hoc Metrics Scripting\n\n## Context\n\n[ADR 24](./adr-0024-metrics-scripting.md) answered the question of how to\ntrack metrics for events in Raster Foundry servers. It doesn't answer how to\ntrack events for requests elsewhere, for example, total logins by user, or non-events,\nlike storage used for uploads. Since\nwe'd also like metrics from third party services and ongoing usage data in the database\nfor querying and eventual visualization, we need to determine a strategy for either\npushing or regularly importing those events as well.\n\n## Options\n\n### Events\n\n#### Option 1 -- webhooks\n\nIn this option, basically there's no such thing as post-hoc metrics, since\nwe use a hook to send an event to our API whenever something occurs that we're\ninterested in. This strategy seemed like it might be possible for Auth0 logins,\nwhich are where I focused experimentation.\n\nWe can execute almost arbitrary javascript after credential exchange via the\n\"Client Credential Exchange\" hook. This means we could fire off a request to\nincrement the login count for a user as part of the login hook. 
See the \n`make-bogus-request` hook in the staging tenant for an example of what this looks\nlike, and [Papertrail logs](https://papertrailapp.com/groups/4082183/events?q=minCloudCover%3D9999)\nfor evidence that" +"# VIM\u64cd\u4f5c\u901f\u8bb0\n\n\n| \u547d\u4ee4 | \u5feb\u6377\u952e | \u64cd\u4f5c\u8bf4\u660e |\n| --- | --- | --- |\n| | Ctrl-w w | \u5149\u6807\u5728\u7a97\u53e3\u4e4b\u95f4\u5207\u6362 |\n| | Ctrl-w h, j, k, l | \u5149\u6807\u5728\u7a97\u53e3\u4e4b\u95f4\u5207\u6362 |\n| | Ctrl-w r | \u5207\u6362\u5f53\u524d\u7a97\u53e3\u5e03\u5c40\u4f4d\u7f6e |\n| | Ctrl-w p | \u5207\u6362\u5230\u524d\u4e00\u4e2a\u7a97\u53e3 | \n| :help table | | tab\u4f7f\u7528\u5e2e\u52a9 |\n| :tabp | gt | \u524d\u4e00\u4e2atab\u9875 |\n| :tabn | gT | \u540e\u4e00\u4e2atab\u9875 |\n| :tabc | | \u5173\u95ed\u5f53\u524dtab |\n| :tabo | | \u5173\u95ed\u5176\u4ed6tab |\n| :tabs | | \u67e5\u770b\u6240\u6709\u6253\u5f00\u7684tab |\n| :split | | \u7a97\u53e3\u6a2a\u5411\u62c6\u5206 |\n| :vsplit | | \u7a97\u53e3\u7eb5\u5411\u62c6\u5206 |\n| | Ctrl-w = | \u8ba9\u6240\u6709\u7a97\u53e3\u5747\u5206 |\n| :resize (-/+)n | Ctrl-w -/+ | \u5f53\u524d\u7a97\u53e3\u51cf\u5c11/\u589e\u52a0\u4e00\uff08n\uff09\u884c |\n| :vertical resize (+/-)n | Ctrl-w >/< | \u5f53\u524d\u7a97\u53e3\u589e\u52a0\u3001\u51cf\u5c11\u4e00\uff08n\uff09\u5217 |\n\n\n\n## nerdtree\n\n\n| \u547d\u4ee4 | \u5feb\u6377\u952e | \u64cd\u4f5c\u8bf4\u660e |\n| --- | --- | --- |\n| :NERDTreeToggle | Ctrl-n | \u6253\u5f00\u3001\u5173\u95ed\u5de6\u4fa7\u5bfc\u822a\u680f |\n| | t | \u5728\u65b0\u7684Tab\u4e2d\u6253\u5f00\u6587\u4ef6 |\n| | i | \u6a2a\u5411\u5206\u5c4f\u6253\u5f00\u6587\u4ef6 |\n| | s | \u7eb5\u5411\u5206\u5c4f\u6253\u5f00\u6587\u4ef6 |\n| | p | \u8df3\u5230\u5f53\u524d\u8282\u70b9\u7684\u7236\u8282\u70b9 |\n| | P | \u8df3\u5230\u5f53\u524d\u8282\u70b9\u7684\u6839\u8282\u70b9 |\n\n\u6587\u6863\uff1ahttps://github.com/scrooloose/nerdtree/blob/master/doc/NERDTree.txt\n\n## vim-go\n\n\n| \u547d\u4ee4 | 
\u5feb\u6377\u952e | \u64cd\u4f5c\u8bf4\u660e |\n| --- | --- | --- |\n| :GoDef | Ctrl-] | \u8f6c\u5230\u5b9a\u4e49" +"Germany's Dresdner Bank and Dutch bank ABN AMRO Holding NV may covet a larger slice of lucrative British fund management business, but analysts warned on Wednesday their shopping lists will be limited.\n\"Some of the smaller quoted firms might get snapped up,\" one banking analyst told Reuters, adding the price range for quoted asset management companies varied from between around 125 million pounds ($207.6 million) to several billion pounds.\n\"There's a large range of value out there, but nothing is on offer,\" he said.\nWell-known top performers such as Mercury Asset Management (MAM) are likely to be prohibitively expensive and none of the major players have indicated they are up for grabs.\nAnd the weaker fund firms are not likely to be of such great interest to European banks trying to make an immediate impact on the market rather than turn around a non-performer.\n\"There's an active trading market for these businesses but some banks feel the prices are too high,\" John Leonard, banking analyst at Salomon Brothers said.\nBanks are not alone in their desire to grab more asset management business, with insurance companies also in the frame. 
And as with other areas of financial services completely new entrants" +"// sourcery:file: skipEquality\nimport Foundation\n\n/// Class that represents a project element.\npublic class PBXObject: Hashable, Decodable, Equatable, AutoEquatable {\n /// Returns the unique identifier.\n /// Note: The unique identifier of an object might change when the project gets written.\n /// If you use this identifier from a scheme, make sure the project is written before the project is.\n public var uuid: String {\n reference.value\n }\n\n /// The object reference in the project that contains it.\n let reference: PBXObjectReference\n\n /**\n Used to differentiate this object from other equatable ones for the purposes of uuid generation.\n\n This shouldn't be required to be set in normal circumstances.\n In some rare cases xcodeproj doesn't have enough context about otherwise equatable objects,\n so it has to resolve automatic uuid conflicts by appending numbers.\n This property can be used to provide more context to disambiguate these objects,\n which will result in more deterministic uuids.\n */\n public var context: String?\n\n // MARK: - Init\n\n init() {\n reference = PBXObjectReference()\n reference.setObject(self)\n }\n\n // MARK: - Decodable\n\n fileprivate enum CodingKeys: String, CodingKey {\n case reference\n }\n\n /// Initializes the object from its project representation.\n ///\n /// - Parameter decoder: XcodeprojPropertyListDecoder decoder.\n /// - Throws: an error if the" 
+"7\n13\n10\n9\n7\n16\n4\n6\n8\n13\n11\n17\n7\n14\n13\n4\n10\n20\n9\n4\n9\n5\n19\n6\n4\n6\n9\n4\n11\n13\n13\n5\n14\n9\n13\n11\n9\n14\n7\n11\n5\n13\n12\n7\n7\n13\n9\n7\n9\n9\n6\n11\n9\n11\n16\n7\n17\n11\n13\n13\n9\n9\n11\n11\n11\n9\n9\n8\n9\n9\n13\n11\n11\n6\n8\n16\n16\n13\n9\n4\n6\n4\n18\n9\n8\n7\n7\n7\n11\n9\n10\n5\n5\n12\n2\n4\n5\n6\n11\n8\n9\n6\n11\n13\n20\n8\n19\n6\n13\n7\n13\n11\n11\n7\n7\n13\n16\n17\n15\n11\n13\n5\n11\n13\n18\n11\n11\n13\n19\n11\n18\n13\n19\n16\n15\n17\n7\n9\n17\n17\n8\n11\n6\n13\n9\n13\n15\n9\n15\n21\n13\n15\n15\n18\n7\n18\n11\n13\n20\n31\n10\n8\n28\n22\n9\n7\n15\n9\n7\n7\n27\n18\n42\n12\n15\n11\n13\n7\n15\n26\n20\n11\n15\n15\n13\n14\n11\n18\n11\n24\n9\n22\n21\n33\n32\n24\n18\n30\n34\n16" +"# Introduction to modules\n\nAngular apps are modular and Angular has its own modularity system called *NgModules*.\nNgModules are containers for a cohesive block of code dedicated to an application domain, a workflow, or a closely related set of capabilities. They can contain components, service providers, and other code files whose scope is defined by the containing NgModule. They can import functionality that is exported from other NgModules, and export selected functionality for use by other NgModules.\n\nEvery Angular app has at least one NgModule class, [the *root module*](guide/bootstrapping), which is conventionally named `AppModule` and resides in a file named `app.module.ts`. You launch your app by *bootstrapping* the root NgModule.\n\nWhile a small application might have only one NgModule, most apps have many more *feature modules*. The *root* NgModule for an app is so named because it can include child NgModules in a hierarchy of any depth.\n\n## NgModule metadata\n\nAn NgModule is defined by a class decorated with `@NgModule()`. The `@NgModule()` decorator is a function that takes a single metadata object, whose properties describe the module. 
The most important properties are as follows.\n\n* `declarations`: The [components](guide/architecture-components), *directives*, and *pipes* that belong to this NgModule.\n\n* `exports`: The subset" +"import { LocalCurrencyCode } from 'src/localCurrency/consts'\n\nexport enum Actions {\n FETCH_CURRENT_RATE = 'LOCAL_CURRENCY/FETCH_CURRENT_RATE',\n FETCH_CURRENT_RATE_SUCCESS = 'LOCAL_CURRENCY/FETCH_CURRENT_RATE_SUCCESS',\n FETCH_CURRENT_RATE_FAILURE = 'LOCAL_CURRENCY/FETCH_CURRENT_RATE_FAILURE',\n SELECT_PREFERRED_CURRENCY = 'LOCAL_CURRENCY/SELECT_PREFERRED_CURRENCY',\n}\nexport interface FetchCurrentRateAction {\n type: Actions.FETCH_CURRENT_RATE\n}\n\nexport interface FetchCurrentRateSuccessAction {\n type: Actions.FETCH_CURRENT_RATE_SUCCESS\n currencyCode: LocalCurrencyCode\n exchangeRate: string\n now: number\n}\n\nexport interface FetchCurrentRateFailureAction {\n type: Actions.FETCH_CURRENT_RATE_FAILURE\n}\n\nexport interface SelectPreferredCurrencyAction {\n type: Actions.SELECT_PREFERRED_CURRENCY\n currencyCode: LocalCurrencyCode\n}\n\nexport type ActionTypes =\n | FetchCurrentRateAction\n | FetchCurrentRateSuccessAction\n | FetchCurrentRateFailureAction\n | SelectPreferredCurrencyAction\n\nexport const fetchCurrentRate = (): FetchCurrentRateAction => ({\n type: Actions.FETCH_CURRENT_RATE,\n})\n\nexport const fetchCurrentRateSuccess = (\n currencyCode: LocalCurrencyCode,\n exchangeRate: string,\n now: number\n): FetchCurrentRateSuccessAction => ({\n type: Actions.FETCH_CURRENT_RATE_SUCCESS,\n currencyCode,\n exchangeRate,\n now,\n})\n\nexport const fetchCurrentRateFailure = (): FetchCurrentRateFailureAction => ({\n type: Actions.FETCH_CURRENT_RATE_FAILURE,\n})\n\nexport const selectPreferredCurrency = (\n currencyCode: LocalCurrencyCode\n): SelectPreferredCurrencyAction => ({\n type: Actions.SELECT_PREFERRED_CURRENCY,\n currencyCode,\n})" +"#!/usr/bin/env ruby\n# vim: set nosta noet ts=4 sw=4:\n#\n# Script to automatically move partitioned tables and their 
indexes\n# to a separate area on disk.\n#\n# Mahlon E. Smith \n#\n# Example use case:\n#\n# - You've got a heavy insert table, such as syslog data.\n# - This table has a partitioning trigger (or is manually partitioned)\n# by date, to separate incoming stuff from archival/report stuff.\n# - You have a tablespace on cheap or slower disk (maybe even\n# ZFS compressed, or some such!)\n#\n# The only assumption this script makes is that your tables are dated, and\n# the tablespace they're moving into already exists.\n#\n# A full example, using the syslog idea from above, where each child\n# table is date partitioned by a convention of \"syslog_YEAR-WEEKOFYEAR\":\n#\n# syslog # <--- parent\n# syslog_2012_06 # <--- inherited\n# syslog_2012_07 # <--- inherited\n# syslog_2012_08 # <--- inherited\n# ...\n#\n# You'd run this script like so:\n#\n# ./warehouse_partitions.rb -F syslog_%Y_%U\n#\n# Assuming this was week 12 of the year, tables syslog_2012_06 through\n# syslog_2012_11 would start sequentially migrating into the tablespace\n# called 'warehouse'." 
+"require File.expand_path('../../spec_helper', __FILE__)\n\ndescribe 'Project Dependency Patch' do\n\n before(:all) do\n Setting.cross_project_issue_relations = '1'\n\n # Issue 1 of Project 1 precedes issue 2 of the same project\n # Issue 1 precedes an issue of Project 2\n # Issue 2 precedes an issue of Project 3\n @first_project_issue1, @first_project_issue2 = create_related_issues(\"precedes\")\n @project1 = @first_project_issue1.project\n @project2, @project3 = Factory(:project), Factory(:project)\n @second_project_issue = Factory(:issue, :project => @project2)\n @third_project_issue = Factory(:issue, :project => @project3)\n create_related_issues(\"precedes\", @first_project_issue1, @second_project_issue)\n create_related_issues(\"precedes\", @first_project_issue2, @third_project_issue)\n end\n\n it 'should find cross project related issues' do\n @project1.cross_project_related_issues.count.should eql(2)\n @project1.cross_project_related_issues.should include(@second_project_issue, @third_project_issue)\n end\n\n it 'should find cross project related issues of other projects' do\n @project2.cross_project_related_issues.count.should eql(1)\n @project2.cross_project_related_issues.should include(@first_project_issue1)\n end\nend" +"Creating an updated ALEZ install ISO\n\nSometimes ALEZ will fail to install because the archlinux-keyring package included \non the ISO is outdated causing PGP verification of the Arch ZFS repo to fail. If \nthat is the case, you may need to create an updated ISO complete with the ZFS packages, \ngit and the ALEZ downloader. 
\n\nHere are the steps to manually create an updated ALEZ install ISO:\n\n* Boot into an up-to-date install of Arch (but not under LXD/LXC as the build script \ndoesn't work in a container) and run the following to install the Arch ISO build \nscripts and their dependencies (and git):\n\n# pacman -S archiso git\n\n* Now copy the releng profile directory:\n\n# cp -r /usr/share/archiso/configs/releng ~/aleziso\n\n* Edit ~/aleziso/packages.x86_64 and add the following lines:\n\ngit\narchzfs-linux\n\n* Edit ~/aleziso/pacman.conf and add the following lines:\n\n[archzfs]\nServer = http://archzfs.com/$repo/x86_64\n\n* Import and sign the archzfs repo key:\n\n# pacman-key -r F75D9D76 && pacman-key --lsign-key F75D9D76\n\n(NB You may want to check the key hasn't changed first by checking \nhttps://wiki.archlinux.org/index.php/Unofficial_user_repositories#archzfs )\n\n* Clone the ALEZ repo\n\ngit clone https://github.com/danboid/ALEZ.git\n\n* Make airootfs directory for alez downloader:\n\n# mkdir -p ~/aleziso/airootfs/usr/local/bin\n\n* Copy scripts into airootfs tree:\n\n#" +"class UWindowConsoleWindow extends UWindowFramedWindow;\n\nvar float OldParentWidth, OldParentHeight;\n\nfunction Created() \n{\n\tSuper.Created();\n\tbSizable = True;\n\tbStatusBar = True;\n\tbLeaveOnScreen = True;\n\n\tOldParentWidth = ParentWindow.WinWidth;\n\tOldParentHeight = ParentWindow.WinHeight;\n\n\tSetDimensions();\n\n\tSetAcceptsFocus();\n}\n\nfunction ShowWindow()\n{\n\tSuper.ShowWindow();\n\n\tif(ParentWindow.WinWidth != OldParentWidth || ParentWindow.WinHeight != OldParentHeight)\n\t{\n\t\tSetDimensions();\n\t\tOldParentWidth = ParentWindow.WinWidth;\n\t\tOldParentHeight = ParentWindow.WinHeight;\n\t}\n}\n\nfunction ResolutionChanged(float W, float H)\n{\n\tSetDimensions();\n}\n\nfunction SetDimensions()\n{\n\tif (ParentWindow.WinWidth < 500)\n\t{\n\t\tSetSize(200, 150);\n\t} else {\n\t\tSetSize(410, 310);\n\t}\n\tWinLeft = ParentWindow.WinWidth/2 - WinWidth/2;\n\tWinTop = 
ParentWindow.WinHeight/2 - WinHeight/2;\n}\n\nfunction Close(optional bool bByParent)\n{\n\tClientArea.Close(True);\n\tRoot.GotoState('');\n}\n\t\ndefaultproperties\n{\n\tWindowTitle=\"Game Console\";\n\tClientClass=class'UWindowConsoleClientWindow'\n}" +"---\nhandle: use-cases\ncanonical: https://maze.digital/webticker/use-cases/\ntitle: News Ticker, Stock Ticker & Image Ticker with jQuery Web Ticker\ndescription: A list of possible use cases of how you can use the jQuery Web Ticker, from news tickers to stock tickers, to image tickers.\n---\n\n\n##Use Cases\n\nBelow are examples of different uses for the web ticker.\n\n### Use Case 1: News Ticker {#news}\n\nNews changes fairly frequently; at times you may want to offer byte-sized headlines going across your website for your customers to know what is going on. The continuous functionality of the WebTicker with the option to pause on hover is great to allow your users to interact with this content. Most importantly, all the items can be in themselves links, taking your users to the relevant articles.\n\n
- WebTicker v3.0.0 has just been released! Read the new documentation.\n- WebTicker v3.0.0 now has commercial licenses available\n- News can potentially be dynamically updated and WebTicker can do just that!
\n\n### Use Case 2: Stock Ticker {#finance}\n\nTick rates change and update all the time, Web Ticker is a great way to show updated pricing to your clients. Through the update facility" +"MODEL 1\nATOM 1 N POPE 1 -2.646 0.841 2.584 0.00 0.00 MEMB N \nATOM 2 HN1 POPE 1 -1.839 0.219 2.374 0.00 0.00 MEMB H \nATOM 3 HN2 POPE 1 -2.809 0.806 3.611 0.00 0.00 MEMB H \nATOM 4 HN3 POPE 1 -3.492 0.456 2.117 0.00 0.00 MEMB H \nATOM 5 C12 POPE 1 -2.396 2.208 2.114 0.00 0.00 MEMB C \nATOM 6 H12A POPE 1 -1.699 2.805 2.739 0.00 0.00 MEMB H \nATOM 7 H12B POPE 1 -3.395 2.694 2.098 0.00 0.00 MEMB H \nATOM 8 C11 POPE 1 -1.788 2.184 0.695 0.00 0.00 MEMB C \nATOM 9 H11A POPE 1 -1.801 3.137 0.125 0.00 0.00 MEMB H \nATOM 10 H11B POPE 1 -2.436 1.530 0.073 0.00 0.00 MEMB H \nATOM 11 P POPE 1 0.336 1.358 -0.715 0.00 0.00 MEMB P \nATOM 12 O13 POPE 1 1.704 0.829 -0.449 0.00 0.00 MEMB O \nATOM 13 O14 POPE 1 0.274 2.675 -1.415 0.00 0.00 MEMB O \nATOM 14 O11 POPE 1 -0.478 0.279 -1.466 0.00 0.00 MEMB O \nATOM 15 O12 POPE 1 -0.453 1.590 0.653 0.00 0.00 MEMB O \nATOM 16 C1 POPE 1 -0.424 -1.121 -1.064 0.00 0.00 MEMB C \nATOM 17 HA POPE 1 -1.234" +"package ratelimiter\n\nimport (\n\t\"time\"\n)\n\n// RateLimiter is a simple rate-limiter for any resources inspired by Cloudflare's approach: https://blog.cloudflare.com/counting-things-a-lot-of-different-things/\ntype RateLimiter struct {\n\tdataStore LimitStore\n\trequestsLimit int64\n\twindowSize time.Duration\n}\n\n// New creates new rate limiter. A dataStore is internal limiter data store, requestsLimit and windowSize are parameters of limiter e.g. 
requestsLimit: 5 and windowSize: 1*time.Minute means that limiter allows up to 5 requests per minute\nfunc New(dataStore LimitStore, requestsLimit int64, windowSize time.Duration) *RateLimiter {\n\n\treturn &RateLimiter{\n\t\tdataStore: dataStore,\n\t\trequestsLimit: requestsLimit,\n\t\twindowSize: windowSize,\n\t}\n}\n\n// Inc increments limiter counter for a given key or returns error when it's not possible\nfunc (r *RateLimiter) Inc(key string) error {\n\tcurrentWindow := time.Now().UTC().Truncate(r.windowSize)\n\treturn r.dataStore.Inc(key, currentWindow)\n}\n\n// LimitStatus represents current status of limitation for a given key\ntype LimitStatus struct {\n\t// IsLimited is true when a given key should be rate-limited\n\tIsLimited bool\n\t// LimitDuration is not nil when IsLimited is true. It's the time for which a given key should be blocked before CurrentRate falls below declared in constructor requests limit\n\tLimitDuration *time.Duration\n\t// CurrentRate is approximated current requests rate per window size (declared in the constructor)\n\tCurrentRate float64\n}\n\n// Check checks status of rate-limiting for a" +"module Lowkiq\n class Server\n def self.build(options)\n require options[:require]\n Lowkiq.on_server_init.call\n\n splitter = Lowkiq.build_splitter.call\n shard_handlers_by_thread = splitter.call Lowkiq.shard_handlers\n scheduler = Lowkiq.build_scheduler.call\n new shard_handlers_by_thread, scheduler\n end\n\n def initialize(shard_handlers_by_thread, scheduler)\n @shard_handlers_by_thread = shard_handlers_by_thread\n @scheduler = scheduler\n @threads = []\n end\n\n def start\n Lowkiq.server_redis_pool.with do |redis|\n Script.load! 
redis\n end\n\n @shard_handlers_by_thread.each do |handlers|\n handlers.each(&:restore)\n end\n\n @threads = @shard_handlers_by_thread.map do |handlers|\n job = @scheduler.build_job handlers\n Thread.new do\n job.call until exit_from_thread?\n end\n end\n end\n\n def stop\n @stopped = true\n end\n\n def join\n @threads.each(&:join)\n end\n\n def exit_from_thread?\n stopped? || failed?\n end\n\n def stopped?\n @stopped\n end\n\n def failed?\n @threads.map(&:status).any? do |status|\n status != \"run\" && status != \"sleep\"\n end\n end\n end\nend" +"---\nlayout: pattern\ntitle: Feature Toggle\nfolder: feature-toggle\npermalink: /patterns/feature-toggle/\npumlid: NSZ14G8X30NGLhG0oDrk8XjPd12OvCTjNy_UthpxiAPvIBhUJc37WyZvgdtWp6U6U5i6CTIs9WtDYy5ER_vmEIH6jx8P4BUWoV43lOIHBWMhTnKIjB-gwRFkdFe5\ncategories: Behavioral\ntags:\n - Java\n - Difficulty-Beginner\n---\n\n## Also known as\nFeature Flag\n\n## Intent\nUsed to switch code execution paths based on properties or groupings. Allowing new features to be released, tested\nand rolled out. Allowing switching back to the older feature quickly if needed. It should be noted that this pattern,\ncan easily introduce code complexity. 
There is also cause for concern that the old feature that the toggle is eventually\ngoing to phase out is never removed, causing redundant code smells and reduced maintainability.\n\n![alt text](./etc/feature-toggle.png \"Feature Toggle\")\n\n## Applicability\nUse the Feature Toggle pattern when\n\n* Giving different features to different users.\n* Rolling out a new feature incrementally.\n* Switching between development and production environments.\n\n## Credits\n\n* [Martin Fowler 29 October 2010 (2010-10-29).](http://martinfowler.com/bliki/FeatureToggle.html)" +"syntax = \"proto3\";\npackage turbo;\n\nmessage TestPrimitives {\n\tint64 Int64Value = 1;\n\tsint32 Int32Value = 2;\n\tuint64 Uint64Value = 3;\n\tuint32 Uint32Value = 4;\n\tfloat Float32Value = 5;\n\tdouble Float64Value = 6;\n\tbool BoolValue = 7;\n}\n\nmessage TestProtoStruct {\n\tint64 Value = 1;\n}\n\nmessage TestTags {\n\tTestTagsData Data = 1;\n}\n\nmessage TestTagsData {\n\tstring UploadFile = 1;\n\tstring UploadUrl = 2;\n\tstring MetadataOnly = 3;\n\tint64 ContentTypeId = 4;\n\tint64 CreativeApiId = 5;\n\tsint32 Duration = 6;\n\tfloat PhysicalDuration = 7;\n\tsint32 Bitrate = 8;\n\tsint32 Height = 9;\n\tsint32 Width = 10;\n\tfloat Fps = 11;\n\tstring Id3Tag = 12;\n}" +"# -*- coding: utf-8 -*-\n\"\"\"A X509Adapter for use with the requests library.\n\nThis file contains an implementation of the X509Adapter that will\nallow users to authenticate a request using an arbitrary\nX.509 certificate without needing to convert it to a .pem file\n\n\"\"\"\n\nfrom OpenSSL.crypto import PKey, X509\nfrom cryptography import x509\nfrom cryptography.hazmat.primitives.serialization import (load_pem_private_key,\n load_der_private_key)\nfrom cryptography.hazmat.primitives.serialization import Encoding\nfrom cryptography.hazmat.backends import default_backend\n\nfrom datetime import datetime\nfrom requests.adapters import HTTPAdapter\nimport requests\n\nfrom .._compat import PyOpenSSLContext\nfrom .. 
import exceptions as exc\n\n\"\"\"\nimporting the protocol constants from _ssl instead of ssl because only the\nconstants are needed and to handle issues caused by importing from ssl on\nthe 2.7.x line.\n\"\"\"\ntry:\n from _ssl import PROTOCOL_TLS as PROTOCOL\nexcept ImportError:\n from _ssl import PROTOCOL_SSLv23 as PROTOCOL\n\n\nclass X509Adapter(HTTPAdapter):\n r\"\"\"Adapter for use with X.509 certificates.\n\n Provides an interface for Requests sessions to contact HTTPS urls and\n authenticate with an X.509 cert by implementing the Transport Adapter\n interface. This class will need to be manually instantiated and mounted\n to the session\n\n :param pool_connections: The number of urllib3 connection pools to\n cache.\n :param pool_maxsize: The maximum number of connections to save in the\n pool.\n :param max_retries: The maximum" +"# Notes on the wolfssl-fips project\n\nFirst, if you did not get the FIPS files with your archive, you must contact\nwolfSSL to obtain them.\n\n\n# Building the wolfssl-fips project\n\nThe wolfCrypt FIPS library for Windows is a part of the wolfSSL library. It\nmust be built as a static library, for the moment.\n\nThe library project is built with Whole Program Optimization disabled. This is\nrequired so that necessary components of the library are not optimized away.\nThere are two functions added to the library that are used as markers in\nmemory for the in-core memory check of the code. WPO consolidates them into a\nsingle function. WPO also optimizes away the automatic FIPS entry function.\n\nEach of the source files inside the FIPS boundary defines their own code and\nconstant section. The code section names start with \".fipsA$\" and the constant\nsection names start with \".fipsB$\". Each subsection has a letter to organize\nthem in a specific order. 
This specific ordering puts marker functions and\nconstants on either end of the boundary so it can be hashed.\n\n\n# In Core Memory Test\n\nThe In Core Memory test calculates a checksum (HMAC-SHA256) of the wolfCrypt\nFIPS library code and" +"#title The Borqs Alchemy Library Usage Manual\n#author Bao Haojun \n\n\nCopyright Borqs, Inc. 2009, All Rights Reserved\n\n\n* Introduction\n\nThis user manual corresponds to Borqs Alchemy Library version\n1.0.6. This is the library to be used for testing the OMS phone. The\nmain functions of this library include:\n - Initialization. Connection to the phone is established through\n usbnet; connection status callback and logging callback functions\n are saved.\n - Execution. Test commands can be sent to the phone. Refer to the\n `TCMD Reference Manual' for a complete detailed list of supported\n commands.\n - Help system. A simple help system is built-in in this library, it\n will print a simplified list of the *TCMD* commands supported by this\n library.\n\nThree files are provided for development, together with a\nsample console project. The 3 development files are:\n 1. =libalchemy.h=, the header file used for compilation\n 2. =alchemylib.lib=, the .lib file used for linking\n 3. =alchemylib.dll=, the .dll file used for running your program.\n\n\n** How to use the library\n\nYou can develop either win32 or console applications using this\nlibrary. MFC must be used, and it must be linked as a shared DLL. \n - If MFC is not used, compilation will" +"#include \"9cc.h\"\n\n// This is a recursive-descendent parser which constructs abstract\n// syntax tree from input tokens.\n//\n// Variable names are resolved at this stage. We create a Var object\n// when we see a variable definition and use it when we see a variable\n// reference.\n//\n// Types are added to variables and literals. 
For other nodes, Sema\n// will add type for them.\n//\n// Semantic checking is omitted from this parser to make the code in\n// this file closely resemble the C's BNF. Invalid expressions, such\n// as `1+2=3`, are accepted at this stage. Such errors are detected in\n// a later pass.\n\nint nlabel = 1;\n\ntypedef struct Env {\n Map *vars;\n Map *typedefs;\n Map *tags;\n struct Env *prev;\n} Env;\n\nstatic Program *prog;\nstatic Vector *lvars;\nstatic Vector *breaks;\nstatic Vector *continues;\nstatic Vector *switches;\n\nstatic Vector *tokens;\nstatic int pos;\nstruct Env *env;\n\nstatic Node null_stmt = {ND_NULL};\n\nstatic Env *new_env(Env *prev) {\n Env *env = calloc(1, sizeof(Env));\n env->vars = new_map();\n env->typedefs = new_map();\n env->tags = new_map();\n env->prev = prev;\n return env;\n}\n\nstatic Var *find_var(char *name) {\n for (Env *e = env; e; e = e->prev) {\n Var *var =" +"# uraster\n\nMicro simple software rasterizer in a single C++11 header file. Does not use OpenGL. Pure C++11. \n\nMostly useful as a way of teaching how the rendering pipeline in hardware works.\n\n![Image from uraster](https://raw.githubusercontent.com/Steve132/uraster/master/example/screenshot.jpg)\n\nA day ago, I saw this post [Asking how to create a software renderer](http://www.reddit.com/r/GraphicsProgramming/comments/2whjam/creating_a_software_renderer/).\n\nI started writing my response, and it looked something like this:\n\n> One should not create a software rasterizer unless they are giant masochists. However, if you were to do it, and first you need to ....\n \nAfter a while I stopped. After writing my tutorial on how to make a C++ rasterizer in english, I realized it was nearly unreadable.\n\nNew idea: I'll just write the tutorial in pseudocode. Then, I looked at the pseudocode. It was unreadable because it wasn't *quite* a language\n\nNew idea: I'll write the skeleton of the header with comments explaining what to put in the code! 
I did this.\n\nSurely you can see where this is going. \n\nI ended up actually filling in the code, and implementing a simple rasterization pipeline all in one standalone C++11 header file. The interface is similar to a dramatically simplified version of OpenGL2.0 with shader support, that" +"C> \\ingroup nwdft\nC> @{\nC>\nC> \\file occup_input.F\nC> Read the occupation numbers\nC>\nC> @}\nC>\nC> \\ingroup nwdft\nC> @{\nC>\nC> \\brief Orbital occupations input reader\nC>\nC> Read the orbital occupation numbers either from the input or from\nC> a file. An example of the input block is\nC>\nC> \\code\nC> occup\nC> 5 3\nC> 1.0 1.0\nC> 1.0 1.0\nC> 1.0 1.0\nC> 1.0\nC> 1.0\nC> end\nC> \\endcode\nC>\nC> The first line with two integers specifies how many occupation\nC> numbers there are to read for each spin channel. Next there are\nC> a number of lines specifying the orbital occupations.\nC>\nC> Similarly for reading the occupation numbers from a file\nC>\nC> \\code\nC> occup\nC> 5 3\nC> load file.occup\nC> end\nC> \\endcode\nC>\nC> After reading the occupation numbers they are stored on the \nC> runtime database in the fields:\nC>\nC> - `focc:occup_switch` -- there was an \"occup\" input block\nC>\nC> - `focc:occup` -- the number of alpha- and beta-occupation numbers\nC>\nC> - `focc:occup_list` -- the list of occupation numbers\nC>\nC> The occupation number list is stored essentially as read." 
+"- name: IRPP - D\u00e9ficits des revenus de capitaux mobiliers - C\u00e9libataire pour un revenu salarial de 20 000 \u20ac\n keywords: rcm\n period: 2009\n absolute_error_margin: 0.5\n input:\n salaire_imposable: 20000\n f2dc: 5000\n f2aa: 1000\n f2al: 1000\n f2am: 1000\n f2an: 1000\n f2aq: 1000\n f2ar: 1000\n output:\n irpp: -1086\n- name: IRPP - D\u00e9ficits des revenus de capitaux mobiliers - C\u00e9libataire pour un revenu salarial de 20 000 \u20ac\n keywords: rcm\n period: 2010\n absolute_error_margin: 0.5\n input:\n salaire_imposable: 20000\n f2dc: 5000\n f2aa: 1000\n f2al: 1000\n f2am: 1000\n f2an: 1000\n f2aq: 1000\n f2ar: 1000\n output:\n irpp: -1181\n- name: IRPP - D\u00e9ficits des revenus de capitaux mobiliers - C\u00e9libataire pour un revenu salarial de 20 000 \u20ac\n keywords: rcm\n period: 2011\n absolute_error_margin: 0.5\n input:\n salaire_imposable: 20000\n f2dc: 5000\n f2aa: 1000\n f2al: 1000\n f2am: 1000\n f2an: 1000\n f2aq: 1000\n f2ar: 1000\n output:\n irpp: -1181\n- name: IRPP - D\u00e9ficits des revenus de capitaux mobiliers - C\u00e9libataire pour un revenu salarial de 20 000 \u20ac\n keywords: rcm\n period: 2012\n absolute_error_margin: 0.5\n input:\n salaire_imposable: 20000\n f2dc: 5000\n f2aa: 1000\n f2al: 1000\n f2am: 1000\n f2an: 1000\n f2aq: 1000\n f2ar: 1000\n output:\n irpp: -1181\n- name: IRPP - D\u00e9ficits des revenus de capitaux mobiliers - C\u00e9libataire pour un revenu salarial" +"Revolver Cartridges\n\nCartridges are created in the Engineer's Workbench, by the use of specific . Blueprints for normal ammunition you can create yourself, but some more advanced types will require you to buy adequate blueprints from someone else.\n<&empty>\u00a7lEmpty \u00a7lCasings\u00a7r and \u00a7lShells\u00a7r are the basis for every projectile. 
Empty Casings can also be crafted in the , using the appropriate mold, at a greatly reduced cost.\n<&casull>\u00a7lCasull \u00a7lCartridges\u00a7r are created with a \"Common Projectiles\" blueprint. They are a simple lead projectile inflicting medium damage.\n<&ap>\u00a7lArmor-Piercing \u00a7lCartridges\u00a7r are created with the same blueprint. They do the same amount of damage but penetrate armor by sheer kinetic force.\n<&buckshot>\u00a7lBuckshot \u00a7lCartridges\u00a7r, created with the aforementioned blueprint, are filled with multiple small projectiles that are fired out in a cone shape. This cartridge is highly effective at short range.\n<&he>\u00a7lHigh-Explosive \u00a7lCartridges\u00a7r are created with a \"Common Projectiles\" blueprint. They fire a projectile that explodes on impact, causing damage to living creatures and terrain.\n<&silver>\u00a7lSilver \u00a7lCartridges\u00a7r, also created with a \"Common Projectiles\" blueprint, consist of a lead core wrapped in silver, making them highly effective against undead.\n<&dragon>\u00a7lDragon's \u00a7lBreath Cartridges\u00a7r are created with a \"Specialized Projectiles\" blueprint. Similarly to Buckshot, they are effective" +"/*\n * make -- maintain program groups\n * td 80.09.17\n * things done:\n *\t20-Oct-82\tMade nextc properly translate \"\\\\\\n[ \t]*\" to ' '.\n *\t15-Jan-85\tMade portable enough for z-8000, readable enough for\n *\t\t\thuman beings.\n *\t06-Nov-85\tAdded free(t) to make() to free used space.\n *\t07-Nov-85\tModified docmd() to call varexp() only if 'cmd'\n *\t\t\tactually contains macros, for efficiency.\n *\t24-Feb-86\tMinor fixes by rec to many things. 
Deferred\n *\t\t\tmacro expansion in actions until needed, deferred\n *\t\t\tgetmdate() until needed, added canonicalization in\n *\t\t\tarchive searches, allowed ${NAME} in actions for\n *\t\t\tshell to expand, put macro definitions in malloc,\n *\t\t\tentered environ into macros.\n *\t17-Oct-86\tVery minor MS-DOS changes by steve: add _cmdname[],\n *\t\t\tconditionalize archive code as #if COHERENT || GEMDOS.\n *\t 8-Dec-86\tRec makes inpath() search current directory first,\n *\t\t\tand allows : in dependency list under MSDOS && GEMDOS.\n *\t 8-Feb-91\tsteve: fix comment handling, allow \"\\#\", allow ${VAR}.\n *\t\t\tAdd docmd0() to make $< and $* work in Makefiles.\n *\t12-Feb-91\tsteve: add $SRCPATH source path searching.\n *\t 1-Nov-91\tsteve: fix bug in nextc() to handle \"\\n\\t\\n\" correctly\n * 29-Sep-92\tmichael: fix problem with defining a rule that also" +"+++\ntitle = \"Reports talk:Tasks not implemented in FreeBASIC\"\ndescription = \"\"\ndate = 2016-10-04T22:49:35Z\naliases = []\n[extra]\nid = 21145\n[taxonomies]\ncategories = []\ntags = []\n+++\n\nThis list seems to be rather old (last edited in August 2014). Who can update it? Or will it be updated automatically? --[[User:Lothar Schirm|Lothar Schirm]], 04 October 2016\n\nThe lists update automatically when there is a change.\nBut you must use the correct way to write FreeBASIC: in the header tag it is FreeBASIC; if it's not correct it will be in red in the list, meaning there is no page for that name. If it's correct it will be in blue and the link to the FreeBASIC page will work. If correct, all the lists will be updated (most of the time within a minute). 
The lang tag is not case sensitive; I write it as freebasic (lowercase).--[[User:Frisian|Frisian]] ([[User talk:Frisian|talk]]) 22:49, 4 October 2016 (UTC)" +"---\nauthors: Josh Wilsdon , Angela Fong \nstate: publish\n---\n\n# RFD 18 Support for using labels to select networks and packages\n\n## Current support for packages/networks in sdc-docker\n\nCurrently a user creating a docker container does not select their networks in\ntheir `docker run` commandline, though they can specify the -P in order to get\nan external NIC. The external NIC is based on the default network configured\nfor the data center. The fabric network they're given is always the network\nconfigured in UFDS as the default for their account.\n\nUsers also do not directly select their package. The package is chosen based on\nthe `-m` parameter if passed (if not, we treat it as 1024m). We'll find the\nsmallest package that fits their memory limit, and since all the packages have\nthe same ratios of cpu/disk/memory all the other parameters are scaled\nappropriately.\n\n\n## Impetus\n\nIn order to allow customers to specify non-default networks, we'd like the\nability for them to add these at container creation. This will allow them to use\ndifferent network setups for different containers without needing to interact\nwith cloudapi, except to create and manage the networks themselves.\n\nIn order to support SDC" +"# Submit an issue\n\nNote that any underlying issues with the content shared in this repository can only be addressed by the original owner and should be reported directly. 
Please only create an issue here if the content shared in this repository needs to be re-evaluated by Adobe.\n\n## Topic\n\nThis is an issue regarding:\n\n- [ ] The XD Awesome repository README.\n- [ ] The open source plugins shared within this repo.\n- [ ] The utility libraries shared within this repo.\n- [ ] The developer tools shared within this repo.\n- [ ] Other\n\n## Description of the issue" +"http://burmese.voanews.com/a/myanmar-ambassador-of-thailand-said-they-will-appeal-the-case-according-to-the-thai-law/3124176.html\n\n\u101c\u102d\u1015\u1039\u1000\u107d\u103c\u1014\u1039\u1038\u1021\u1019\u1088 \u1021\u101a\u1030\u1001\u1036\u1040\u1004\u1039\u1016\u102d\u102f\u1094 \u103b\u1015\u1004\u1039\u1006\u1004\u1039\n\n\u107f\u1017\u102d\u1010\u102d\u1014\u1039\u108f\u102f\u102d\u1004\u1039\u1004\u1036\u101e\u102c\u1038 \u1000\u1019\u102c\u107b\u101c\u103d\u100a\u1039\u1037\u1001\u101b\u102e\u1038\u101e\u100a\u1039\u108f\u103d\u1005\u1039\u1025\u102e\u1038\u1000\u102d\u102f \u1011\u102d\u102f\u1004\u1039\u1038\u108f\u102f\u102d\u1004\u1039\u1004\u1036 \u1031\u1000\u102c\u1037\u1031\u1010\u102c\u1004\u1039 \u1021\u1015\u1014\u1039\u1038\u1031\u103b\u1016\u1000\u107d\u103c\u1014\u1039\u1038\u1019\u103d\u102c \u101e\u1010\u1039\u1001\u1032\u1037\u1010\u101a\u1039\u1006\u102d\u102f\u1010\u1032\u1037\u1005\u103c\u1032\u1001\u103a\u1000\u1039\u1014\u1032\u1094 \u1031\u101e\u1012\u100f\u1039 \u1001\u103a\u1001\u1036\u1011\u102c\u1038\u101b\u1010\u1032\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u108f\u102f\u102d\u1004\u1039\u1004\u1036\u101e\u102c\u1038\u108f\u103d\u1005\u1039\u1025\u102e\u1038\u101b\u1032 \u1094\u1021\u1019\u1088\u1000\u102d\u102f \u1011\u102d\u102f\u1004\u1039\u1038\u108f\u102f\u102d\u1004\u1039\u1004\u1036\u101b\u1032\u1095 \u1010\u101b\u102c\u1038\u1025\u1015\u1031\u1012\u1021\u1010\u102d\u102f\u1004\u1039\u1038 \u1006\u1000\u1039\u107f\u1015\u102e\u1038 
\u1021\u101a\u1030\u1001\u1036\u1040\u1004\u1039\u101e\u103c\u102c\u1038\u108f\u102d\u102f\u1004\u1039\u1016\u102d\u102f\u1094\u1021\u1010\u103c\u1000\u1039 \u1011\u102f\u102d\u1004\u1039\u1038\u108f\u102d\u102f\u1004\u1039\u1004\u1036 \u1031\u101b\u103d \u1094\u1031\u1014\u1019\u103a\u102c\u1038\u1031\u1000\u102c\u1004\u1039\u1005\u102e\u1014\u1032\u1094 \u1018\u1014\u1039\u1031\u1000\u102c\u1000\u1039\u107f\u1019\u102d\u1033 \u1094\u1000 \u103b\u1019\u1014\u1039\u1019\u102c\u101e\u1036\u1021\u1019\u1010\u1039\u1015\u102b\u1040\u1004\u1039\u1010\u1032\u1037 \u101c\u102d\u1015\u1039\u1000\u107d\u103c\u1014\u1039\u1038\u1021\u1019\u1088\u1021\u1010\u103c\u1000\u1039 \u101c\u102d\u102f\u1000\u1039\u1015\u102b\u1031\u1006\u102c\u1004\u1039\u101b\u103c\u1000\u1039\u1031\u1014\u1010\u1032\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u101e\u1036\u101b\u1036\u102f\u1038\u1021\u1011\u1030\u1038\u1031\u1000\u102c\u1039\u1019\u1010\u102e\u1010\u102d\u102f\u1094 \u1031\u1010\u103c\u1006\u1036\u102f\u1001\u1032\u1037\u107f\u1015\u102e\u1038 \u101e\u1000\u1039\u1031\u101e\u1021\u1031\u1011\u102c\u1000\u1039\u1021\u1011\u102c\u1038 \u1021\u1001\u103a\u1000\u1039\u1021\u101c\u1000\u1039 \u1005\u102f\u1031\u1006\u102c\u1004\u1039\u1038\u1016\u102d\u102f\u1094 \u1021\u1010\u103c\u1000\u1039\u103b\u1015\u1004\u1039\u1006\u1004\u1039\u1031\u1014\u107e\u1000\u1015\u102b\u1010\u101a\u1039\u104b \u1019\u1031\u1021\u1038\u1031\u1021\u1038\u1019\u102c\u1000\u101e\u1010\u1004\u1039\u1038\u1031\u1015\u1038\u1015\u102d\u102f\u1094\u1011\u102c\u1038\u1015\u102b\u1010\u101a\u1039 \u104b\n\n\u107f\u1017\u102d\u1010\u102d\u1014\u1039\u108f\u102f\u102d\u1004\u1039\u1004\u1036\u101e\u102c\u1038 \u1000\u1019\u102c\u107b\u101c\u103d\u100a\u1039\u1037\u1001\u101b\u102e\u1038\u101e\u100a\u1039\u108f\u103d\u1005\u1039\u1025\u102e\u1038\u1000\u102d\u102f \u1011\u102d\u102f\u1004\u1039\u1038\u108f\u102f\u102d\u1004\u1039\u1004\u1036 \u1031\u1000\u102c\u1037\u1031\u1010\u102c\u1004\u1039 
\u1021\u1015\u1014\u1039\u1038\u1031\u103b\u1016\u1000\u107d\u103c\u1014\u1039\u1038\u1019\u103d\u102c \u101e\u1010\u1039\u1001\u1032\u1037\u1010\u101a\u1039\u1006\u102d\u102f\u1010\u1032\u1037 \u1005\u103c\u1032\u1001\u103a\u1000\u1039\u1014\u1032\u1094 \u1031\u101e\u1012\u100f\u1039 \u1001\u103a\u1001\u1036\u1011\u102c\u1038\u101b\u1010\u1032\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u108f\u102f\u102d\u1004\u1039\u1004\u1036\u101e\u102c\u1038 \u108f\u103d\u1005\u1039\u1025\u102e\u1038\u101b\u1032 \u1094 \u1021\u1019\u1088\u1014\u1032\u1094 \u1015\u1010\u1039\u101e\u1000\u1039\u101c\u102f\u102d\u1094\u101c\u102d\u1015\u1039\u1000\u107d\u103c\u1014\u1039\u1038\u1021\u1019\u1088\u1021\u1010\u103c\u1000\u1039 \u1016\u1032\u103c\u1094\u1005\u100a\u1039\u1038\u1011\u102c\u1038\u1010\u1032\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u101e\u1036\u101b\u1036\u102f\u1038\u1021\u1011\u1030\u1038\u1031\u1000\u102c\u1039\u1019\u1010\u102e\u1040\u1004\u1039\u1031\u1010\u103c \u1014\u1032\u1094 \u1011\u102f\u102d\u1004\u1039\u1038\u1031\u101b\u103d\u1095\u1031\u1014\u1019\u103a\u102c\u1038\u1031\u1000\u102c\u1004\u1039\u1005\u102e\u1014\u1032\u1094 \u1010\u102d\u102f\u1004\u1039\u1015\u1004\u1039\u1031\u1006\u103c\u1038\u1031\u108f\u103c\u1038\u1019\u1088\u1031\u1010\u103c\u101c\u102f\u1015\u1039\u1001\u1032\u1037\u1021\u107f\u1015\u102e\u1038\u1019\u103d\u102c\u1031\u1010\u102c\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u101e\u1036\u1021\u1019\u1010\u1039\u1025\u102e\u1038\u1040\u1004\u1039\u1038\u1031\u1019\u102c\u1004\u1039\u1000 \u1011\u102f\u102d\u1004\u1039\u1038\u108f\u102d\u102f\u1004\u1039\u1004\u1036\u101b\u1032\u1095 \u1010\u101b\u102c\u1038\u1025\u1015\u1031\u1012\u1021\u1010\u102f\u102d\u1004\u1039\u1038\u1006\u1000\u1039\u107f\u1015\u102e\u1038 \u1021\u101a\u1030\u1001\u1036\u1040\u1004\u1039\u101e\u103c\u102c\u1038\u1019\u101a\u1039\u101c\u102f\u102d\u1094 
\u1031\u103b\u1015\u102c\u1006\u102f\u102d\u101e\u103c\u102c\u1038\u1015\u102b\u1010\u101a\u1039\u104b\n\n\u201c\u1011\u102d\u102f\u1004\u1039\u1038\u108f\u102d\u102f\u1004\u1039\u1004\u1036\u1021\u1005\u102d\u102f\u1038\u101b\u1014\u1032\u1094 \u1011\u102d\u102f\u1004\u1039\u1038\u108f\u102d\u102f\u1004\u1039\u1004\u1036\u101b\u1032\u1095 \u1025\u1015\u1031\u1012\u1021\u101b \u1011\u102d\u102f\u1004\u1039\u1038\u1040\u1014\u1039\u1080\u1000\u102e\u1038\u1001\u103a\u1033\u1015\u1039\u1000\u102d\u102f\u101a\u1039\u1010\u102d\u102f\u1004\u1039 \u1021\u101a\u1030\u1001\u1036\u1040\u1004\u1039\u108f\u102d\u102f\u1004\u1039\u1031\u107e\u1000\u102c\u1004\u1039\u1038 \u101c\u1019\u1039\u1038\u1001\u1004\u1039\u1038\u1031\u1015\u1038\u1010\u1032\u1037\u1021\u1010\u103c\u1000\u1039 \u1000\u103a\u1031\u1014\u102c\u1039\u1010\u102d\u102f\u1094 \u1006\u1000\u1039\u107f\u1015\u102e\u1038 \u1021\u101a\u1030\u1001\u1036\u1040\u1004\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1012\u102e\u1000\u1031\u101c\u1038\u1031\u1010\u103c\u101b\u1032\u1095\u1031\u1019\u103d\u103a\u102c\u1039\u101c\u1004\u1039\u1037\u1001\u103a\u1000\u1039\u1000\u102d\u102f \u1006\u1000\u1039\u101c\u1000\u1039\u107f\u1015\u102e\u1038 \u1031\u1006\u102c\u1004\u1039\u101b\u103c\u1000\u1039\u101e\u103c\u102c\u1038\u1019\u103d\u102c\u103b\u1016\u1005\u1039\u1015\u102b\u1010\u101a\u1039\u104b\u201d\n\n\u1011\u102d\u102f\u1004\u1039\u1038\u1031\u101b\u103d \u1094\u1031\u1014\u1019\u103a\u102c\u1038\u1031\u1000\u102c\u1004\u1039\u1005\u102e\u101f\u102c \u1031\u101b\u103d \u1094\u1031\u1014 \u1041\u1041 \u1025\u102e\u1038\u1015\u102b\u1021\u1016\u1032\u103c\u1095\u1014\u1032\u1094\u103b\u1019\u1014\u1039\u1019\u102c\u101c\u1030\u1004\u101a\u1039\u1042 \u1025\u102e\u1038\u101b\u1032 \u1094\u1021\u1019\u1030\u1000\u102f\u102d \u1021\u1001\u1019\u1032\u1037 \u1031\u101b\u103d \u1094\u1031\u1014 \u101c\u102d\u102f\u1000\u1039\u1031\u1015\u1038\u1001\u1032\u1037\u107f\u1015\u102e\u1038 
\u1010\u101b\u102c\u1038\u1019\u103a\u103d\u1010\u1005\u103c\u102c \u1005\u102e\u101b\u1004\u1039\u108f\u102d\u102f\u1004\u1039\u1016\u102d\u102f\u1094\u1021\u1010\u103c\u1000\u1039 \u1025\u1015\u1031\u1012\u1031\u107e\u1000\u102c\u1004\u1039\u1038\u1021\u101b \u1021\u1000\u1030\u100a\u102e\u1031\u1015\u1038\u1031\u1014\u1001\u1032\u1037\u1010\u102c\u1015\u102b\u104b \u101e\u1030\u1010\u102d\u102f\u1021\u1031\u1014\u1014\u1032\u1094 \u1001\u102d\u102f\u1004\u1039\u1019\u102c\u1010\u1032\u1037 \u101e\u1000\u1039\u1031\u101e \u1021\u1031\u1011\u102c\u1000\u1039\u1021\u1011\u102c\u1038 \u1031\u1010\u103c\u101c\u1036\u102f\u1031\u101c\u102c\u1000\u1039\u1005\u103c\u102c\u1019\u101b\u1001\u1032\u1037\u1010\u102c\u1031\u107e\u1000\u102c\u1004\u1037\u1039 \u1012\u102e\u1021\u1019\u1030\u1000\u102d\u102f \u101c\u102f\u102d\u1000\u1039\u1015\u102b\u1031\u1006\u102c\u1004\u1039\u101b\u103c\u1000\u1039\u1010\u1032\u1037\u1021\u1001\u102b\u1019\u103d\u102c \u101c\u102d\u102f\u1021\u1015\u1039\u1001\u103a\u1000\u1039\u1031\u1010\u103c\u101c\u100a\u1039\u1038\u101b\u103d\u102d\u1031\u1014\u1001\u1032\u1037\u1010\u101a\u1039\u101c\u102d\u102f\u1094 \u1006\u102d\u102f\u1015\u102b\u1010\u101a\u1039 \u104b \u1012\u102e\u1000\u1031\u1014\u1094\u1019\u103d\u102c\u1031\u1010\u102c\u1037 \u1011\u102d\u102f\u1004\u1039\u1038\u1031\u101b\u103d \u1094\u1031\u1014\u1019\u103a\u102c\u1038\u1031\u1000\u102c\u1004\u1039\u1005\u102e\u1000 \u103b\u1019\u1014\u1039\u1019\u102c\u101e\u1036\u1021\u1019\u1010\u1039\u1014\u1032\u1094 \u101c\u102d\u1015\u1039\u1000\u107d\u103c\u1014\u1039\u1038 \u1021\u1010\u103c\u1000\u1039 \u1016\u1032\u103c\u1005\u100a\u1039\u1038\u1011\u102c\u1038\u1010\u1032\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u101e\u1036\u101b\u1036\u102f\u1038\u1021\u1011\u1030\u1038\u1031\u1000\u102c\u1039\u1019\u1010\u102e\u1014\u1032\u1094 \u1031\u1010\u103c\u1006\u1036\u102f\u107f\u1015\u102e\u1038 \u103b\u1019\u1014\u1039\u1019\u102c\u101c\u1030\u1004\u101a\u1039\u1042 
\u1025\u102e\u1038 \u1021\u1010\u103c\u1000\u1039 \u1021\u101a\u1030\u1001\u1036\u1006\u1000\u1039\u1010\u1000\u1039\u108f\u102d\u102f\u1004\u1039\u1016\u102d\u102f\u1094\u1025\u1015\u1031\u1012\u1031\u107e\u1000\u102c\u1004\u1039\u1038\u1021\u101b \u1006\u1000\u1039\u101c\u1000\u1039 \u101c\u102f\u1015\u1039\u1031\u1006\u102c\u1004\u1039\u108f\u102d\u102f\u1039\u1004\u1039\u1010\u1032\u1037\u1021\u1001\u103a\u1000\u1039\u1031\u1010\u103c\u1000\u102f\u102d \u1031\u1006\u103c\u1038\u1031\u108f\u103c\u1038\u1031\u103b\u1015\u102c\u1006\u102d\u102f\u1001\u1032\u1037\u1010\u101a\u1039\u1039\u101c\u102d\u102f \u101e\u1036\u101b\u1036\u102f\u1038\u1021\u1011\u1030\u1038\u1031\u1000\u102c\u1039\u1019\u1010\u102e\u1031\u1001\u102b\u1004\u1039\u1038\u1031\u1006\u102c\u1004\u1039\u103b\u1016\u1005\u1039\u1010\u1032\u1037 \u1025\u102e\u1038\u1031\u1000\u103a\u102c\u1039\u1031\u101e\u102c\u1004\u1039\u1038\u1000 \u1031\u103b\u1015\u102c\u1015\u102b\u1010\u101a\u1039\u104b\n\n\u201c\u1031\u101b\u103d \u1094\u1031\u1014\u1019\u103a\u102c\u1038\u1031\u1000\u102c\u1004\u1039\u1005\u102e \u1025\u1000\u1060\u1092\u1000 Dej Udom Krairit \u1031\u1015\u102b\u1037\u1031\u1014\u102c\u1039\u104b \u101e\u1030\u1000\u1031\u103b\u1015\u102c\u1010\u102c\u1000 \u1021\u1001\u103a\u1000\u1039 \u1043 \u1001\u103a\u1000\u1039\u1031\u103b\u1015\u102c\u1010\u101a\u1039\u104b \u1021\u101a\u1030\u1001\u1036\u1000\u102d\u1005\u1065\u1019\u103d\u102c\u1010\u1032\u1037 \u1021\u1019\u1088\u1000\u102d\u102f \u1021\u1005\u1000\u1031\u1014\u1021\u1006\u1036\u102f\u1038\u1021\u1011\u102d \u103b\u1015\u1014\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1005\u1005\u1039\u1031\u1006\u1038\u1031\u1015\u1038\u1016\u102d\u102f\u1094 \u1031\u1010\u102c\u1004\u1039\u1038\u1006\u102d\u102f\u108f\u102d\u102f\u1004\u1039\u1005\u101b\u102c\u1000\u102d\u1005\u1065\u101b\u103d\u102d\u1010\u101a\u1039\u1010\u1032\u1037\u104b \u1021\u1032\u1012\u102b\u1031\u1010\u103c\u1000\u1031\u1010\u102c\u1037 
\u1018\u102c\u101c\u1032\u1006\u102d\u102f\u1031\u1010\u102c\u1037 \u1014\u1036\u1015\u1010\u1039 \u1041 \u101e\u1000\u1039\u1031\u101e\u1001\u1036\u1031\u1010\u103c\u101f\u102c \u101c\u103c\u1032\u1019\u103d\u102c\u1038\u1010\u101a\u1039\u104b \u101e\u1000\u1039\u1031\u101e\u1031\u1010\u103c\u101c\u100a\u1039\u1038\u101c\u103c\u1032\u1019\u103d\u102c\u1038\u1010\u101a\u1039\u1006\u102d\u102f\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1000\u103a\u1031\u1014\u102c\u1039\u1010\u102d\u102f\u1094\u1018\u1000\u1039\u1000 \u1010\u101b\u102c\u1038\u1001\u1036\u1018\u1000\u1039\u1000 \u1031\u1001\u103a\u1015\u108f\u102d\u102f\u1004\u1039\u1010\u1032\u1037\u1021\u1001\u103a\u1000\u1039\u1021\u101c\u1000\u1039\u101b\u103d\u102d\u101b\u1004\u1039 \u1010\u1004\u1039\u103b\u1015\u101c\u102d\u102f\u1094\u101b\u1010\u101a\u1039\u1010\u1032\u1037\u104b \u1014\u1036\u1015\u1010\u1039 \u1042 \u1021\u1001\u103a\u1000\u1039\u1000\u1031\u1010\u102c\u1037 \u1015\u1005\u1065\u100a\u1039\u1038\u101e\u1000\u1039\u1031\u101e\u1031\u1010\u103c\u101f\u102c \u1021\u1010\u102f\u1021\u1015\u1031\u1010\u103c\u103b\u1016\u1005\u1039\u1031\u1014\u1010\u101a\u1039\u104b \u1021\u1005\u1005\u1039\u1021\u1019\u103d\u1014\u1039\u1019\u101f\u102f\u1010\u1039\u1018\u1030\u1038\u104b \u1031\u1014\u102c\u1000\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1010\u1004\u1039\u103b\u1015\u1010\u1032\u1037\u1021\u1001\u103a\u1000\u1039\u1021\u101c\u1000\u1039\u1031\u1010\u103c\u1000\u101c\u100a\u1039\u1038 \u1021\u1019\u103d\u102c\u1038\u1021\u101a\u103c\u1004\u1039\u1038\u1031\u1010\u103c\u103b\u1016\u1005\u1039\u1031\u1014\u1010\u101a\u1039\u1006\u102d\u102f\u101c\u102d\u102f\u1094\u101b\u103d\u102d\u101b\u1004\u1039\u101c\u100a\u1039\u1038 \u103b\u1015\u1014\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037\u1019\u103d \u1021\u1019\u1088\u1000\u102d\u102f \u1021\u1005\u1021\u1006\u1036\u102f\u1038\u103b\u1015\u1014\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 
\u1005\u1005\u1039\u1031\u1006\u1038 \u101c\u102d\u102f\u1094\u101b\u1015\u102b\u1010\u101a\u1039\u104b \u1014\u1036\u1015\u1010\u1039 \u1043 \u1021\u1001\u103a\u1000\u1039\u1000 \u1011\u1015\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1010\u1004\u1039\u103b\u1015\u108f\u102d\u102f\u1004\u1039\u1010\u1032\u1037 \u101e\u1000\u1039\u1031\u101e\u1031\u1010\u103c \u1015\u1005\u1065\u100a\u1039\u1038\u1031\u1010\u103c\u104a \u101e\u1000\u1039\u1031\u101e\u1000 \u101e\u1000\u1039\u1019\u1032\u1037\u104a\u101e\u1000\u1039\u101b\u103d\u102d \u101e\u1000\u1039\u1031\u101e\u1031\u1010\u103c \u101b\u103d\u102d\u1001\u1032\u1037\u101b\u1004\u1039 \u1012\u102e\u1021\u1001\u103a\u1000\u1039 \u1043 \u1001\u103a\u1000\u1039\u1011\u1032\u1000 \u1010\u1001\u103a\u1000\u1039\u1001\u103a\u1000\u1039 \u1000\u102d\u102f\u101a\u1039\u1037\u1018\u1000\u1039\u1000 \u1031\u1011\u102c\u1000\u1039\u103b\u1015\u108f\u102d\u102f\u1004\u1039\u1005\u101b\u102c \u1031\u103b\u1015\u102c\u108f\u102d\u102f\u1004\u1039\u1005\u101b\u102c\u101b\u103d\u102d\u101b\u1004\u1039 \u1012\u102e\u1021\u1019\u1088\u1000\u102d\u102f \u1021\u1005\u1021\u1006\u1036\u102f\u1038\u1011\u102d\u103b\u1015\u1014\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1005\u1005\u1039\u1031\u1006\u1038\u1001\u103c\u1004\u1039\u1037\u1010\u1004\u1039\u101c\u102d\u102f\u1094\u101b\u1015\u102b\u1010\u101a\u1039\u1010\u1032\u1037\u104b \u1012\u102e\u1021\u1001\u103a\u1000\u1039 \u1043 \u1001\u103a\u1000\u1039\u1011\u1032\u1019\u103d\u102c \u1010\u1001\u102f\u1001\u102f\u1031\u1010\u102c\u1004\u1039\u101c\u100a\u1039\u1038\u1031\u1000\u102c\u1004\u1039\u1038 \u104a \u1021\u1001\u103a\u1000\u1039 \u1042 \u1001\u103a\u1000\u1039\u1031\u101e\u102c\u1039\u101c\u100a\u1039\u1038\u1031\u1000\u102c\u1004\u1039\u1038\u104a \u1021\u1001\u103a\u1000\u1039 \u1043 \u1001\u103a\u1000\u1039\u1031\u101e\u102c\u1039\u101c\u100a\u1039\u1038\u1031\u1000\u102c\u1004\u1039\u1038 \u104a 
\u1010\u1001\u102f\u1001\u102f\u101b\u103d\u102d\u1001\u1032\u1037\u101c\u102d\u102f\u1094\u101b\u103d\u102d\u101b\u1004\u1039 \u1019\u102d\u101e\u102c\u1038\u1005\u102f\u1040\u1004\u1039\u1010\u1025\u102e\u1038\u1025\u102e\u1038\u1000 \u103b\u1015\u1014\u1039\u107f\u1015\u102e\u1038\u1031\u1010\u102c\u1037 \u1012\u102e\u1021\u1019\u1088\u1000\u102d\u1005\u1065\u1000\u102d\u102f \u103b\u1015\u1014\u1039\u101c\u100a\u1039\u1005\u1005\u1039\u1031\u1006\u1038\u1031\u1015\u1038\u1016\u102d\u102f\u1094\u1021\u1010\u103c\u1000\u1039 \u1031\u1010\u102c\u1004\u1039\u1038\u1006\u102d\u102f\u101c\u102d\u102f\u1094\u101b\u1015\u102b\u1010\u101a\u1039\u1010\u1032\u1037\u104b\u201d\n\n\u101b\u1014\u1039\u1000\u102f\u1014\u1039\u1021\u1015\u102b\u1021\u1040\u1004\u1039 \u1014\u101a\u1039\u1005\u1015\u1039\u107f\u1019\u102d\u1033\u1095\u1031\u1010\u103c\u1019\u103d\u102c\u1031\u1010\u102c\u1037 \u103b\u1019\u1014\u1039\u1019\u102c\u108f\u102f\u102d\u1004\u1039\u1004\u1036\u101e\u102c\u1038\u101c\u1030\u1004\u101a\u1039 \u1042" +"#~# ORIGINAL\n\nx = 1\n xyz = 2\n\n w = 3\n\n#~# EXPECTED\nx = 1\nxyz = 2\n\nw = 3\n\n#~# ORIGINAL\n\nx = 1\n foo[bar] = 2\n\n w = 3\n\n#~# EXPECTED\nx = 1\nfoo[bar] = 2\n\nw = 3\n\n#~# ORIGINAL\n\nx = 1; x = 2\n xyz = 2\n\n w = 3\n\n#~# EXPECTED\nx = 1; x = 2\nxyz = 2\n\nw = 3\n\n#~# ORIGINAL\n\na = begin\n b = 1\n abc = 2\n end\n\n#~# EXPECTED\na = begin\n b = 1\n abc = 2\n end\n\n#~# ORIGINAL\n\na = 1\n a += 2\n\n#~# EXPECTED\na = 1\na += 2\n\n#~# ORIGINAL\n\nfoo = 1\n a += 2\n\n#~# EXPECTED\nfoo = 1\na += 2\n\n#~# ORIGINAL\n\nx = 1\n xyz = 2\n\n w = 3\n\n#~# EXPECTED\nx = 1\nxyz = 2\n\nw = 3" +"# 2.1.3+2\n* Prepare for upcoming change to File.openRead()\n\n# 2.1.3+1\n* Apply control flow lints.\n\n# 2.1.3\n* Apply lints.\n* Pin to Dart `>=2.0.0 <3.0.0`.\n* Use at least version `2.0.0-rc.0` of `angel_framework`.\n\n# 2.1.2+1\n* Fix a typo that prevented `Range` requests from working.\n\n# 2.1.2\n* Patch support for range+streaming in Caching 
server.\n\n# 2.1.1\n* URI-encode paths in directory listing. This produces correct URLs, always.\n\n# 2.1.0\n* Include support for the `Range` header.\n* Use MD5 for etags, instead of a weak ETag.\n\n# 2.0.2\n* Fixed invalid HTML for directory listings.\n\n# 2.0.1\n* Remove use of `sendFile`.\n* Add a `p.isWithin` check to ensure that paths do not escape the `source` directory.\n* Handle `HEAD` requests.\n\n# 2.0.0\n* Upgrade dependencies to Angel 2 + file@5.\n* Replace `useStream` with `useBuffer`.\n* Remove `package:intl`, just use `HttpDate` instead.\n\n# 1.3.0+1\n* Dart 2 fixes.\n* Enable optionally writing responses to the buffer instead of streaming.\n\n# 1.3.0\n* `pushState` uses `strict` mode when `accepts` is passed.\n\n# 1.3.0-alpha+2\n* Added an `accepts` option to `pushState`.\n* Added optional directory listings.\n\n# 1.3.0-alpha+1\n* ETags once again only encode the first 50 bytes" +"# Streaming at Scale integration tests\n\n## Integration test suite\n\nThe suite is designed to run in a fully automated manner in Azure DevOps. It\ndeploys each scenario from the streaming-at-scale project in a new resource\ngroup, then runs a verification job in Azure Databricks that reads the entire\nset of events generated by the scenario, and tests assertions on the throughput\nand latency. For instance, after running the eventhubs-functions-cosmosdb\nscenario at 5000 events/second, the verification job downloads the output data\nstored in CosmosDB and computes the end-to-end latency and throughput\naggregated at one-minute intervals. 
To account for some runtime variability, the\ntest asserts that a throughput of at least 90% of the expected throughput (in\nthat case, 270,000 events in a one-minute interval) was reached at least once.\n\nSince the provisioning of an Azure Databricks workspace cannot be fully\nautomated at present, you must generate a PAT token of a preexisting workspace\nand supply it to the pipeline.\n\n## Installing the build agent\n\nAs the integration tests can run for more than 6 hours, they must be run on self-hosted Azure DevOps agents.\n\nCreate a project in Azure DevOps. Create an agent pool named \"streaming-at-scale\".\n\nIn the Azure portal," +"/*\nModule : AADATE.CPP\nPurpose: Implementation for the algorithms which convert between the Gregorian and Julian calendars and the Julian Day\nCreated: PJN / 29-12-2003\nHistory: PJN / 10-11-2004 1. Fix for CAADate::Get so that it works correctly for propalactive calendar dates\n PJN / 15-05-2005 1. Fix for CAADate::Set(double JD, bool bGregorianCalendarCalendar) not setting the m_bGregorianCalendarCalendar\n member variable correctly. \n PJN / 26-01-2006 1. After a bug report from Ing. Taras Kapuszczak that a round trip of the date 25 January 100 as \n specified in the Gregorian calendar to the Julian day number and then back again produces the\n incorrect date 26 January 100, I've spent some time looking into the 2 key Meeus Julian Day\n algorithms. 
It seems that the algorithm which converts from a Calendar date to JD works ok for\n                     proleptic dates, but the reverse algorithm which converts from a JD to a Calendar date does not.\n                     Since I made the change in behaviour to support proleptic Gregorian dates to address issues\n                     with the Moslem calendar (and since then I have discovered further unresolved bugs in the Moslem\n                     calendar algorithms and advised people to check out my AA+ library instead), I am now reverting\n                     these changes so that" +"# Testing Magento 2\n\n**Warden** is the first Development Environment that has testing in the blood.\n\nTo enable testing components, set the following configuration in your project's `.env` file:\n\n```\nWARDEN_TEST_DB=1\n```\n\nThis will launch an additional MySQL 5.7 database instance running on `tempfs` (blazing-fast memory) storage.\n\nTemporary file locations have also been mounted into `tempfs` memory storage:\n\n- `/var/www/html/dev/tests/integration/tmp/`\n\n## Running Unit Tests\n\nCreate your `phpunit.xml` using the contents of `phpunit.xml.dist`. We recommend customizing values of:\n\n- Memory usage `TESTS_MEM_USAGE_LIMIT` with a value of `8G` should be enough\n\n ```xml\n \n \n \n ```\n \nThat's it! Now you are ready to run Unit Tests.\n\n### Execution\n\n- To run all tests declared in `phpunit.xml` execute:
`vendor/bin/phpunit -c dev/tests/unit/phpunit.xml`\n- If you need to run only a specific directory, execute:
`vendor/bin/phpunit -c dev/tests/unit/phpunit.xml {PATH TO TESTS}` \n\n### Debugging\n\nIf you have [configured Xdebug](xdebug.md), run Unit tests inside **Debug** console (`warden debug` instead of `warden shell`). The code execution will stop at the breakpoints.\n\n## Running Javascript Unit Tests\n\n1. Configure your `.env` and set `NODE_VERSION=10`\n2. Launch a shell session within the project environment's `php-fpm` container with `warden shell`\n3. Install javascript unit test dependencies with `npm install`\n4. Deploy static content" +"#pragma once\n\n#include \"BaseTheme.hpp\"\n#include \"common/Singleton.hpp\"\n#include \"util/RapidJsonSerializeQString.hpp\"\n\n#include \n#include \n#include \n#include \n\nnamespace chatterino {\n\nclass WindowManager;\n\nclass Theme final : public Singleton, public BaseTheme\n{\npublic:\n Theme();\n\n /// SPLITS\n struct {\n QColor messageSeperator;\n QColor background;\n QColor dropPreview;\n QColor dropPreviewBorder;\n QColor dropTargetRect;\n QColor dropTargetRectBorder;\n QColor resizeHandle;\n QColor resizeHandleBackground;\n\n struct {\n QColor border;\n QColor background;\n QColor text;\n QColor focusedText;\n // int margin;\n } header;\n\n struct {\n QColor border;\n QColor background;\n QColor selection;\n QColor focusedLine;\n QColor text;\n QString styleSheet;\n // int margin;\n } input;\n } splits;\n\n void normalizeColor(QColor &color);\n\nprivate:\n void actuallyUpdate(double hue, double multiplier) override;\n void fillLookupTableValues(double (&array)[360], double from, double to,\n double fromValue, double toValue);\n\n double middleLookupTable_[360] = {};\n double minLookupTable_[360] = {};\n\n pajlada::Signals::NoArgSignal repaintVisibleChatWidgets_;\n\n friend class WindowManager;\n};\n\n} // namespace chatterino" +"/*++ BUILD Version: 0001 // Increment this if a change has global effects\n\nCopyright (c) 1993 ACER America Corporation\n\nModule Name:\n\n acer.h\n\nAbstract:\n\n This header file defines 
the unique interfaces, defines and structures\n for the ACER product line\n\nRevision History:\n 1.0b - plm initial release\n 1.1b - acer.c: halpacereisa: handle scrambled eisa data gracefully.\n\n--*/\n\n#define ACER_HAL_VERSION_NUMBER \"Acer HAL Version 1.1b for October Windows NT Beta.\\n\"\n\n\n/* ACER Special I/O Port definitions\n * I/O Port Address 0xcc4h\n *\t\t\t|\n *\t\t\t0: cpu0 & cpu1\n *\t\t\tc: cpu2 & cpu3\n *\n * bits < 7 6 5 4 3 2 1 0 > (WRITE-ONLY)\n *\t 0 0 |\t0 0 | 0\t|\n *\t |\t |\tBIOS Shadow Control\n *\t |\t |\t0: ROM BIOS\n *\t |\t |\t1: RAM BIOS\n *\t |\t |\n *\t |\t |\n *\t |\t Write-Back Cache Control\n *\t |\t 0: write-thru ( write-back disabled)\n *\t |\t 1: write-back enabled\n *\t |\n *\t 15Mb to 16Mb Memory Setup\n *\t 0: Ram\n *\t 1: EISA\n *\n */\n\n\n// where do i find the CSR which controls the write-back enabling?\n#define\tACER_PORT_CPU01\t 0xcc4\t // write only - setup reg. cpu 0,1\n#define\tACER_PORT_CPU23\t 0xccc4\t // write only - setup
To use this syntax, use the line comment character '#',\nimmediately followed by '@'. Apache's thrift generator will treat the\nremaining text as a line comment, but scrooge will interpret any following\nnamespace declaration as if not commented-out. For example:\n\n::\n\n namespace java com.twitter.tweetservice.thriftjava\n #@namespace scala com.twitter.tweetservice.thriftscala" +"---\ntitle: Travis CI for R?\ndate: '2013-04-07'\nslug: travis-ci-for-r\n---\n\nI'm always worried about [CRAN](http://cran.r-project.org): a system maintained by FTP and emails from real humans (basically one of Uwe, Kurt or Prof Ripley). I'm worried for two reasons:\n\n1. The number of R packages is growing _exponentially_;\n2. Time and time again I see frustrations from both parties (CRAN maintainers and package authors);\n\nI have a good solution for 2, which is to keep silent when your submission passes the check system, and say \"Sorry!\" no matter if you agree with the reason or not when it did not pass (which made one maintainer unhappy), but do not argue -- just go back and fix the problem if you know what is the problem; or use dark voodoo to hide (yes, _hide_, not solve) the problem if you are sure you are right. If you read the mailing list frequently, you probably remember that `if (CRAN)` discussion. The solution in my mind was `if (Sys.getenv('USER') == 'ripley')`.\n\nThe key is, do not argue. Silence is gold.\n\n![You shall not pass](https://db.yihui.org/imgur/3mdv0k9.jpg)\n\nThe CRAN maintainers have been volunteering their time, and we should respect them. The question is, will this approach" +"---\n.. title: Mask an image with a shape\n.. summary: How to mask an image with a shape\n.. type: howto\n---\n\nSince 0.9.0 openFrameworks has had the ability to set the alpha mask for a texture with another texture. 
In this example, we draw a path into an FBO (offscreen image) and then pass the result to the image we want to mask.\n\n```\nofPath path;\nofImage img;\nofFbo fbo;\n\n\nvoid setup(){\n path.lineTo(...);\n // draw on the path, level of gray means alpha\n\n fbo.allocate(w,h,GL_LUMINANCE); //or GL_RED if you are using the programmable renderer\n fbo.begin();\n path.draw();\n fbo.end();\n\n img.getTexture().setAlphaMask(fbo.getTexture());\n}\n\nvoid draw(){\n img.draw();\n}\n```" +"# absolute root path of your azerothcore repository\n# It should not be modified if you don't really know what you're doing\nSRCPATH=\"$AC_PATH_ROOT\"\n\n# absolute path where build files must be stored\nBUILDPATH=\"$AC_PATH_ROOT/var/build/obj\"\n\n# absolute path where binary files must be stored\nBINPATH=\"$AC_PATH_ROOT/env/dist\"\n\n# bash fills it by default with your os type. No need to change it.\n# Change it if you really know what you're doing.\n# OSTYPE=\"\"\n\n# When using linux, our installer automatically get information about your distro\n# using lsb_release. If your distro is not supported but it's based on ubuntu or debian,\n# please change it to one of these values.\n#OSDISTRO=\"ubuntu\"\n\n# absolute path where config. 
files must be stored\n# default: the system will use binpath by default\n# CONFDIR=\"$AC_PATH_ROOT/env/dist/etc/\"\n\n##############################################\n#\n# COMPILER_CONFIGURATIONS\n#\n##############################################\n\n\n# Set preferred compilers.\n# To use gcc (not suggested) instead of clang change in:\n# CCOMPILERC=\"/usr/bin/gcc\"\n# CCOMPILERCXX=\"/usr/bin/g++\"\n#\nCCOMPILERC=\"/usr/bin/clang\"\nCCOMPILERCXX=\"/usr/bin/clang++\"\n\n\n# how many thread must be used for compilation ( leave zero to use all available )\nMTHREADS=0\n# enable/disable warnings during compilation\nCWARNINGS=ON\n# enable/disable some debug informations ( it's not a debug compilation )\nCDEBUG=OFF\n# specify compilation type\nCTYPE=Release\n# compile" +"The components in this area implement several file decompression\nmechanisms. They provide real-time decompression to permit inspection\n(rules) of the decompressed content.\n\nIn particular the components support these decompression options:\n\n1. Decompress SWF (Adobe Flash) files compressed with the ZLIB algorithm\n\n2. Optionally decompress SWF files compressed with the LZMA algorithm.\n This is only available if Snort ++ is built with the optional LZMA\n support.\n\n3. Decompress the Deflate compressed portions of PDF files.\n\nThe three modes are individually enabled/disabled at initialization time.\n\nAll parsing and decompression is incremental and allows inspection to\nproceed as the file is received and processed.\n\nSWF File Processing:\n\nSWF files exist in three forms: 1) uncompressed, 2) ZLIB compressed, and 3)\nLZMA compressed. SWF files begin with a file signature block (always\nuncompressed) to indicate the format of the balance of the file. 
The\nbalance of the file is formatted and processed as specified.\n\nPDF files are significantly more complex as the compressed content is\nembedded within the PDF syntax and one file may contain one or many\ncompressed segments.\n\nThus the PDF decompression engine implements a lightweight PDF file parser\nto locate the PDF Stream segments and then attempt to decompress Streams\nthat" +"---\n3ip: 9\ntitle: Ghost Threads: Ephemeral Threads on 3box\nauthor: Ghilia Weldesselasie @ghiliweld\ndiscussions-to: https://github.com/3box/3box/issues/675\nstatus: Draft\ncreated: 2019-08-17\nrequires: 3IP 1\n---\n\n## Simple Summary\nEphemeral messages on 3box, for when we don't want messages to persist (like in Persistent Threads).\n\n## Abstract\nCurrently 3Box supports building chat like system though the threads api. However, for building things like trollboxes, or online lists, etc. threads are not the best fit. Ipfs has a feature called pubsub, which is a system for passing messages between peers. Pubsub can be used to create an ephemeral chat system where messages are not persisted anywhere in the network, rather they are just sent across once and seen by anyone online at that point. \n\n## Motivation\nOne could create Threads and delete message we don't want persisted, but if we already know we don't want to persist any messages in a chat it makes more sense to simply use pubsub and not save messages from the get-go. Which is why we're working on this new standard for ephemeral messaging.\n\n## Specification\n\n### Room Discovery\n\nEach pubsub room on ipfs has a name that serves as a topic between peers. 
To distinguish between different" +"declare module jCafe {\n \n export interface ParticipantAudio {\n /** Connection state of this participant's audio */\n state: Property;\n\n /** Set this property to mute/unmute this participant (local or remote) */\n isMuted: Property;\n\n /** \n * For the local participant set this property to hold/resume the call;\n * for a remote participant this property is read-only and is true when\n * the remote participant is holding the call. \n */\n isOnHold: Property;\n\n /** True if there is audible sound in the channel */\n isSpeaking: Property;\n \n /**\n * Defines the endpoint where the participant is being reached\n * To reach participant on Skype network and have Skype to Skype audio, \n * set it to corresponding person.id - this is default value\n * To reach participant on phone network and have Skype to PSTN audiocall, \n * set it to corresponding person.phoneNumbers[].telUri\n */\n endpoint: Property;\n }\n}" +"const int five = 5;\nint array[five];\n\nclass A\n {\n public:\n enum values { zero, nonzero };\n };\n\ntypedef int PRInt32;\nconst PRInt32 kDefaultStringSize = 64;\nenum eCharSize {eOneByte=0,eTwoByte=1};\nchar mBuffer_1[kDefaultStringSize << eTwoByte];\n\nchar mBuffer_2[kDefaultStringSize << A::zero];\n\n// Note that \"A::nonzero\" will be unparsed as unparsed as \"eTwoByte\"\n// because the types are equivalent. Not sure where this should be \n// fixed or if it might be an EDG issue. I expect that our name\n// mangling is using values for enums rather than names. This is\n// not a critical issue but should be fixed at some point, the \n// resulting code is still correct, but transformations would be\n// an issue.\nchar mBuffer_3[kDefaultStringSize << A::nonzero];" +"from django.db import models\n\n\nclass FederalAccount(models.Model):\n \"\"\"\n Represents a single federal account. 
A federal account encompasses multiple Treasury Account Symbols (TAS),\n represented by: model:`accounts.TreasuryAppropriationAccount`.\n \"\"\"\n\n agency_identifier = models.TextField(db_index=True)\n main_account_code = models.TextField(db_index=True)\n account_title = models.TextField()\n federal_account_code = models.TextField(null=True) # agency_identifier + '-' + main_account_code\n parent_toptier_agency = models.ForeignKey(\n \"references.ToptierAgency\",\n models.DO_NOTHING,\n null=True,\n help_text=(\n \"The toptier agency under which this federal account should appear in lists and dropdowns. Not \"\n \"as simple as just mapping the AID to an agency, although AID does factor into the decision.\"\n ),\n )\n\n class Meta:\n managed = True\n db_table = \"federal_account\"\n unique_together = (\"agency_identifier\", \"main_account_code\")\n\n @staticmethod\n def fa_rendering_label_to_component_dictionary(fa_rendering_label) -> dict:\n return {\"faaid\": fa_rendering_label.split(\"-\")[0], \"famain\": fa_rendering_label.split(\"-\")[1]}" +"#N canvas 94 39 629 576 10;\n#X text 430 514 (c)2011 \\, Marian Weger;\n#X obj 287 127 spigot 1;\n#X msg 287 179 \\$3 \\$2 \\$1;\n#X obj 287 80 r /midi/\\$1/in;\n#X obj 37 142 spigot 1;\n#X obj 37 80 r \\$2;\n#X obj 37 360 + 0.5;\n#X obj 37 380 int;\n#X obj 287 484 t b a b;\n#X msg 326 510 0;\n#X obj 306 537 s \\$2;\n#X msg 287 510 1;\n#X obj 37 188 spigot 1;\n#X obj 287 324 * 1;\n#X obj 287 275 / 127;\n#X obj 37 306 / 1;\n#X obj 37 328 * 127;\n#X obj 155 80 inlet out-state;\n#X obj 400 80 inlet in-state;\n#X obj 37 254 - \\$7;\n#X obj 52 282 r \\$0-scaling;\n#X obj 37 433 list append \\$4;\n#X obj 287 368 kdemux2 \\$8;\n#X obj 344 395 pack f \\$8;\n#X obj 344 423 line 0 \\$9;\n#X obj 287 253 route \\$4;\n#X obj 302 299 r \\$0-scaling;\n#X obj 287 157 route \\$3;\n#X obj 37 401 list prepend \\$3;\n#X obj 287 346 + \\$7;\n#X text 33 14 midi_bi: " +"#!/bin/sh\nexec perl -x $0 $*; echo \"Could not exec perl!\"; exit 1\n# The 
line above allows perl to be anywhere, as long as it's in your\n# PATH environment variable.\n\n#!perl\n#\n# fix-titles.pl\n# $Id$\n#\n# The HTML stylesheet likes to print html has the ends of tags on a different\n# line, like this:\n# FreeBSD\n#\n# Glimpse, which is indexing our website, finds this very confusing and\n# it cannot pick out the title from this mess. This script takes a list\n# of HTML files on the command line and attempts to make the tag\n# look more normal so that glimpse can understand it.\n#\n# WARNING: This is a hack. It's made to work on docbook generated html, but\n# may do strange things on anything else.\n\nuse strict;\n\nforeach my $file (@ARGV) {\n print \"Fixing $file\\n\";\n rename $file, \"$file.orig\";\n open (IN, \"$file.orig\") || die \"open $file.orig\";\n open (OUT, \">$file\") || die \"open $file for writing\";\n while (<IN>) {\n if (/^<HTML$/) {\n print OUT \"<HTML>\\n\";\n } elsif (/^><HEAD$/) {\n print OUT \"<HEAD>\\n\";\n } elsif (/^><TITLE$/) {\n print OUT \"<TITLE>\";\n } elsif" +"require \"test_helper\"\nrequire \"minitest/mock\"\n\nclass TraceImporterJobTest < ActiveJob::TestCase\n def test_success_notification\n # Check that the user gets a success notification when the trace has valid points\n trace = create(:trace)\n\n gpx = Minitest::Mock.new\n def gpx.actual_points\n 5\n end\n\n trace.stub(:import, gpx) do\n TraceImporterJob.perform_now(trace)\n end\n\n email = ActionMailer::Base.deliveries.last\n assert_equal trace.user.email, email.to[0]\n assert_match(/success/, email.subject)\n\n ActionMailer::Base.deliveries.clear\n end\n\n def test_failure_notification\n # Check that the user gets a failure notification when the trace has no valid points\n trace = create(:trace)\n\n gpx = Minitest::Mock.new\n def gpx.actual_points\n 0\n end\n\n trace.stub(:import, gpx) do\n TraceImporterJob.perform_now(trace)\n end\n\n email = ActionMailer::Base.deliveries.last\n assert_equal trace.user.email, email.to[0]\n 
assert_match(/failure/, email.subject)\n\n ActionMailer::Base.deliveries.clear\n end\n\n def test_error_notification\n # Check that the user gets a failure notification when something goes badly wrong\n trace = create(:trace)\n trace.stub(:import, -> { raise }) do\n TraceImporterJob.perform_now(trace)\n end\n\n email = ActionMailer::Base.deliveries.last\n assert_equal trace.user.email, email.to[0]\n assert_match(/failure/, email.subject)\n\n ActionMailer::Base.deliveries.clear\n end\nend" +"Hierarchical deterministic Wallets (bip32)\n\nJames Chiang\n\n<https://twitter.com/kanzure/status/1047714436889235456>\n\nvideo: <https://www.youtube.com/watch?v=OVvue2dXkJo>\n\n<https://teachbitcoin.io/presentations/wallets.html>\n\n# Introduction\n\nMy name is James Chiang. Quick introduction about myself. My contributions to bitcoin have been on the project libbitcoin where I do documentation. I've been working through the APIs and libraries. I think it's a great toolkit. Eric Voskuil is speaking later today and he also works on libbitcoin. I also volunteered to talk about hierarchical deterministic wallets.\n\nWhen we talk about bitcoin wallets, you always have some kind of secret or entropy you want to keep safe. One way to handle it and store it more easily is to use word lists. Obviously, you want to derive new fresh keys whenever you transact, so that's child key derivation. Also, there's a tree structure for standard recovery of the keys.\n\n# bip39: Mnemonic keywords\n\n<https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki>\n\nWhat we're doing here is we have some kind of entropy to encode words. So here we have 128 bits of entropy. But you can have any multiple of 32 bits all the way up to 256 bits. When we encode these words, we want to make sure we have this checksum part. 
There's a 4-bit checksum, first few bits of the entropy" +"package internal\n\nimport (\n\t\"github.com/johnfercher/maroto/pkg/props\"\n\t\"github.com/jung-kurt/gofpdf\"\n)\n\n// Signature is the abstraction which deals of how to add a signature space inside PDF\ntype Signature interface {\n\tAddSpaceFor(label string, cell Cell, textProp props.Text)\n}\n\ntype signature struct {\n\tpdf gofpdf.Pdf\n\tmath Math\n\ttext Text\n}\n\n// NewSignature create a Signature\nfunc NewSignature(pdf gofpdf.Pdf, math Math, text Text) *signature {\n\treturn &signature{\n\t\tpdf,\n\t\tmath,\n\t\ttext,\n\t}\n}\n\n// AddSpaceFor create a space for a signature inside a cell\nfunc (s *signature) AddSpaceFor(label string, cell Cell, textProp props.Text) {\n\tleft, top, _, _ := s.pdf.GetMargins()\n\tspace := 4.0\n\n\tlineCenterY := cell.Height / 1.33\n\tcell.Y += lineCenterY\n\n\ts.pdf.Line(cell.X+left+space, cell.Y+top, cell.X+cell.Width+left-space, cell.Y+top)\n\n\tcell.Y += 2.0\n\ts.text.Add(label, cell, textProp)\n}" +"StartChar: uni0311\nEncoding: 441 785 324\nWidth: 0\nFlags: MW\nAnchorPoint: \"top_accent\" 158 570 mark 0\nLayerCount: 2\nFore\nSplineSet\n194 519 m 0\n 191 542 188 579 152 579 c 0\n 111 579 83 560 71 526 c 0\n 67 515 56 498 46 498 c 0\n 36 498 32 508 32 516 c 0\n 32 577 95 636 169 636 c 0\n 201 636 241 607 241 566 c 0\n 241 533 228 498 208 498 c 0\n 201 498 195 511 194 519 c 0\nEndSplineSet\nValidated: 1\nSubstitution2: \"Smallcaps subtable\" uni0311.sc\nSubstitution2: \"'case' Case-Sensitive Forms lookup 6 subtable\" uni0311.cap\nEndChar" +"control-character : Vez\u00e9rl\u0151karakterek\nbasic-latin : Latin (alap)\nlatin-1-supplement : Latin-1 kieg\u00e9sz\u00edt\u0151 karakterek\nlatin-extended-a : B\u0151v\u00edtett Latin (A)\nlatin-extended-b : B\u0151v\u00edtett Latin (B)\nipa-extensions : IPA kiterjeszt\u00e9sek\nspacing-modifier-letters : Fonetikus jelek\ncombining-diacritical-marks : Kombin\u00e1lt diakritikus jelek\ngreek-coptic : G\u00f6r\u00f6g \u00e9s 
Kopt\ncyrillic : Cirill\ncyrillic-supplement : Cirill kieg\u00e9sz\u00edt\u0151 karakterek\narmenian : \u00d6rm\u00e9ny\nhebrew : H\u00e9ber\narabic : Arab\nsyrian : Sz\u00edr\narabic-supplement : Arab kieg\u00e9sz\u00edt\u0151 karakterek\nthaana : Thaana\nnko : Nko\nsamaritan : Szamarit\u00e1n\nmandaic : Mandaic\narabic-extended-a : Kiterjesztett Arab (A)\ndevanagari : D\u00e9van\u00e1gari\nbengali : Beng\u00e1li\ngurmukhi : Gurmukhi\ngujarati : Gudzsarati\noriya : Orija\ntamil : Tamil\ntelugu : Telugu\nkannada : Kannada\nmalayalam : Malayalam\nsinhala : Sinhala\nthai : Thai\nlao : Lao\ntibetan : Tibeti\nmyanmar : Burmai\ngeorgian : Gr\u00faz\nhangul-jamo : Hangul Jamo\nethiopic : Eti\u00f3p\nethiopic-supplement : Eti\u00f3p kieg\u00e9sz\u00edt\u0151 karakterek\ncherokee : Cseroki\nunified-canadian-aboriginal-syllabics : Egyszer\u0171s\u00edtett kanadai bennsz\u00fcl\u00f6tt jelek\nogham : Ogham\nrunic : Runic\ntagalog : Tagalog\nhanunoo : Hanun\u00f3o\nbuhid : Buhid\ntagbanwa : Tagbanwa\nkhmer : Khmer\nmongolian : Mongol\nunified-canadian-aboriginal-syllabics-extended : Egyszer\u0171s\u00edtett kanadai bennsz\u00fcl\u00f6tt jelek kieg\u00e9sz\u00edt\u00e9se\nlimbu : Limbu\ntai-le : Tai Le\nnew-tai-lue : \u00daj Tai L\u00fc\nkhmer-symbols : Khmer szimb\u00f3lumok\nbuginese : Bugin\u00e9z\ntai-tham :" +"// this cannot be located at source/brower.ts as otherwise it would become the browser entry\n\n// Imports\nimport { Transform } from '../transform.js'\n\n/** Mapping of ANSI color codes into a CSS style */\nexport interface Styles {\n\t[key: string]: {\n\t\t/** The ANSI color code used to start the style */\n\t\tstart: string\n\t\t/** The ANSI color code used to end the style */\n\t\tend: string\n\t\t/** The CSS style value */\n\t\tvalue: string\n\t}\n}\n\n/** Configuration optons for the Caterpillar Browser Transform */\nexport interface BrowserOptions {\n\t/** Use to override the default value of {@link Filter.color} */\n\tcolor?: boolean\n\t/** Use to override 
the default value of {@link Filter.console} */\n\toutput?: boolean\n\t/** Use to override the default value of {@link Filter.styles} */\n\tstyles?: Styles\n}\n\n/**\n * Convert human readable Caterpillar entries into browser compatible entries.\n * @example\n * ``` javascript\n * import { Logger, Human, Browser } from 'caterpillar'\n * const logger = new Logger()\n * logger.pipe(new Human()).pipe(new Browser())\n * logger.log('info', 'some', {data: 'oh yeah'}, 42)\n * ```\n */\nexport class Browser extends Transform {\n\t/** Whether or not we should use color. */\n\tpublic color: boolean = true\n\n\t/** Whether or not we" +"package com.aricneto.twistytimer.stats;\n\nimport android.util.Log;\n\nimport com.aricneto.twistytimer.items.AverageComponent;\nimport com.aricneto.twistytimer.utils.PuzzleUtils;\nimport com.aricneto.twistytimer.utils.StatUtils;\n\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.Collections;\nimport java.util.List;\n\n/**\n * Calculates the average time of a number of puzzle solves. Running averages are easily calculated\n * as each new solve is added. If the number of solve times is five or greater, the best and worst\n * times are discarded before returning the truncated arithmetic mean (aka \"trimmed mean\" or\n * \"modified mean) of the remaining times. All times and averages are in milliseconds. The mean,\n * minimum (best) and maximum (worst) of all added times, and the best average from all values of\n * the running average are also made available.\n *\n * @author damo\n */\npublic final class AverageCalculator {\n // NOTE: This implementation is reasonably efficient, as it can calculate an average without\n // iterating over the array of recorded times. Iteration is only required when the known best\n // or worst values are ejected to make room for new times. 
Storing times in sorted order would\n // reduce this explicit iteration, but a second data structure would be required to record the\n // insertion order and the insertion operations would be more costly, as the sort" +"``django-google-maps`` is a simple application that provides the basic\nhooks into google maps V3 api for use in Django models from Django\nversion 1.11+.\n\nStarting with ``django-google-maps`` version (0.7.0), Django 1.11+ is\nrequired because Django changed their widget template rendering system.\nVersion 0.8.0 supports Django 2.0+, and as such removes support for\nPython 2.7\n\nI\u2019m using this to allow someone from the admin panels to type a freeform\naddress, have the address geocoded on change and plotted on the map. If\nthe location is not 100% correct, the user can drag the marker to the\ncorrect spot and the geo coordinates will update.\n\nStatus\n~~~~~~\n\n|Build Status|\n\nUSAGE:\n------\n\n- include the ``django_google_maps`` app in your ``settings.py``\n\n- Add your Google Maps API Key in your ``settings.py`` as\n ``GOOGLE_MAPS_API_KEY``\n\n- create a model that has both an address field and geolocation field\n\n .. code:: python\n\n from django.db import models\n from django_google_maps import fields as map_fields\n\n class Rental(models.Model):\n address = map_fields.AddressField(max_length=200)\n geolocation = map_fields.GeoLocationField(max_length=100)\n\n- in the ``admin.py`` include the following as a formfield_override\n\n .. 
code:: python\n\n from django.contrib import admin\n from django_google_maps import widgets as map_widgets\n from django_google_maps import fields as map_fields\n\n class RentalAdmin(admin.ModelAdmin):\n formfield_overrides = {\n map_fields.AddressField: {'widget': map_widgets.GoogleMapsAddressWidget}," +"#!/usr/bin/env bash\nset -evx\n\nif [ \"$TARGET\" = \"Scala.JS\" ]; then\n sbt scala212 \\\n js/compile \\\n js/test:compile \\\n coreJS/fastOptJS \\\n cacheJS/fastOptJS \\\n testsJS/test:fastOptJS \\\n js/test:fastOptJS \\\n js/test\nelif [ \"$TARGET\" = \"ScalaNative\" ]; then\n curl -f https://raw.githubusercontent.com/scala-native/scala-native/master/scripts/travis_setup.sh | bash -x\n\n sbt scala212 \\\n launcher-native_03/publishLocal \\\n launcher-native_040M2/publishLocal \\\n cli/pack\n\n modules/cli/target/pack/bin/coursier bootstrap \\\n -S \\\n -o native-echo \\\n io.get-coursier:echo_native0.3_2.11:1.0.1\n\n if [ \"$(./native-echo -n foo a)\" != \"foo a\" ]; then\n echo \"Error: unexpected output from native test bootstrap.\" 1>&2\n exit 1\n fi\nelif [ \"$TARGET\" = \"Website\" ]; then\n amm-runner website.sc generate\nelse\n sbt scalaFromEnv \\\n jvmProjects/compile \\\n jvmProjects/test:compile\n\n if [ \"$(uname)\" == \"Darwin\" ]; then\n IT=\"testsJVM/it:test\" # doesn't run proxy-tests in particular\n else\n IT=\"jvmProjects/it:test\"\n fi\n\n ./scripts/with-redirect-server.sh \\\n ./modules/tests/handmade-metadata/scripts/with-test-repo.sh \\\n sbt scalaFromEnv jvmProjects/test \"$IT\"\n\n sbt scalaFromEnv evictionCheck compatibilityCheck\nfi" +"---\ntitle: How to write a good coding article\nlayout: post\nslug: writing-good-coding-articles\ntags:\n - teaching\n - writing\nnewsletter: better-fed\ndescription: A good article shows a student how to think through a problem. The student will go \"oooohhhhh!\" as they read through the article. 
You can write a good programming article if you watch out for these five factors.\n---\n\nA good article shows a student how to think through a problem. The student will go \"oooohhhhh!\" as they read through the article. They'll understand the concept they're trying to learn, and they'll stop searching the web for the same topic.\n\nLousy articles do the opposite. Students get more confused as they read through the article. They may even wonder if they have what it takes to learn programming.\n\nIt doesn't take much to turn a bad article into a good one. The content can remain the same. You only need to get five factors right.\n\n<!-- more -->\n\n## The five factors\n\nThe five factors are:\n\n1. The purpose of your article\n2. Who's the student\n3. The examples and analogies used\n4. The language used\n5. The ease of reading\n\n## The purpose of your article\n\nImagine you're" +"#' @section Uniqueness:\n#'\n#' By default the tree IDs are numbered from 1 to n, n being the number of trees found. The problem\n#' with such incremental numbering is that, while it ensures a unique ID is assigned for each tree in\n#' a given point-cloud, it also guarantees duplication of tree IDs in different tiles or chunks when\n#' processing a `LAScatalog`. This is because each file is processed independently of the others and potentially\n#' in parallel on different computers. Thus, the index always restarts at 1 on each file or chunk. Worse,\n#' in a tree segmentation process, a tree that is located exactly between 2 files will have two different\n#' IDs for its two halves.\n#'\n#' This is why we introduced some uniqueness strategies that are all imperfect and that should be seen\n#' as experimental. Please report any troubleshooting. Using a uniqueness-safe strategy ensures that\n#' trees from different files will not share the same IDs. 
Moreover, it also means that two halves of a tree\n#' on the edge of a processing chunk will be assigned the same ID.\n#'\n#' \\describe{\n#' \\item{incremental}{Number from 0 to n. This method" +"c This Formular is generated by mcnf\nc\nc horn? no \nc forced? no \nc mixed sat? no \nc clause length = 3 \nc\np cnf 50 218 \n 45 -29 8 0\n-49 17 11 0\n-40 -17 -23 0\n-1 49 20 0\n-10 -18 40 0\n-25 -24 48 0\n44 27 41 0\n-17 -43 30 0\n32 33 11 0\n24 26 42 0\n-33 38 44 0\n-5 14 49 0\n-8 33 -45 0\n49 3 -10 0\n34 -27 2 0\n-45 23 3 0\n10 7 36 0\n-10 36 -5 0\n46 8 42 0\n-38 -19 12 0\n50 37 45 0\n8 -46 12 0\n19 -33 26 0\n11 47 48 0\n3 45 34 0\n21 22 4 0\n-18 -31 8 0\n-1 9 -38 0\n-32 -35 -23 0\n-45 2 -24 0\n-45 36 -17 0\n11 -18 20 0\n23 48 -18 0\n37 -5 26 0\n38 29 -26 0\n8 -1 -27 0\n-14 -9 -45 0\n42 50 39 0\n-1 47 17 0\n-50 43 -29 0\n-17 -11 -36 0\n20 -48 43 0\n-6 -22 -19 0" +"#include \"decode.h\"\n\n#define MAX_OGG_PAGE_LEN 100000\n\n\ntypedef struct {\n FILE *fp;\n uint8_t *page_buf;\n int page_len;\n int consumed;\n int raw_opus;\n uint8_t version;\n uint8_t header_type;\n uint8_t seg_length;\n uint8_t page_segs;\n uint64_t granule_pos;\n uint32_t bs_serial;\n uint32_t page_sn;\n uint32_t checksum;\n uint8_t seg_len_table[255];\n uint8_t current_segment;\n uint8_t packets_in_page;\n}opus_obj_t;\n\ntypedef struct _nalu_item_t {\n\tuint8_t *buf;\n\tint len;\n\tstruct slice_header_t slice;\n\tstruct nalu_t nalu;\n}nalu_item_t;\n\ntypedef struct {\n FILE *fp;\n h264_dec_obj_t *h264_handle;\n nalu_item_t *curr_nalu;\n nalu_item_t *next_nalu;\n}h264_obj_t;\n\n\nint init_video(h264_obj_t *handle, const char *video_file);\nint reset_video(h264_obj_t *handle);\nint get_video_frame(h264_obj_t *handle, uint8_t *buf, uint32_t *length, int *end_of_frame);\nint init_audio(opus_obj_t *handle, const char *audio_file, int raw_opus);\nvoid close_audio(opus_obj_t *handle);\nint reset_audio(opus_obj_t *handle);\nint get_audio_packet(opus_obj_t *handle, uint8_t *buf, uint32_t *length);" +"# How 
to use RollingFileOutputter\n\n$: << \"../lib\"\nrequire 'log4r'\ninclude Log4r\n\nputs \"this will take a while\"\n\n# example of log file being split by time constraint 'maxtime'\nconfig = {\n \"filename\" => \"logs/TestTime.log\",\n \"maxtime\" => 10,\n \"trunc\" => true\n}\ntimeLog = Logger.new 'WbExplorer'\ntimeLog.outputters = RollingFileOutputter.new(\"WbExplorer\", config)\ntimeLog.level = DEBUG\n\n# log something once a second for 100 seconds\n100.times { |t|\n timeLog.info \"blah #{t}\"\n sleep(1.0)\n}\n\n# example of log file being split by space constraint 'maxsize'\nconfig = {\n \"filename\" => \"logs/TestSize.log\",\n \"maxsize\" => 16000,\n \"trunc\" => true\n}\nsizeLog = Logger.new 'WbExplorer'\nsizeLog.outputters = RollingFileOutputter.new(\"WbExplorer\", config)\nsizeLog.level = DEBUG\n\n# log a large number of times\n100000.times { |t|\n sizeLog.info \"blah #{t}\"\n}\n\nputs \"done! check the two sets of log files in logs/ (TestTime and TestSize)\"" +"// IM has the following fastpaths:\n// - constant index (constant)\n// - need negative int check (neg)\n// - needs hole check (hole)\n// So to test everything we have to do:\n// constant | neg | hole\n// test 1: 0 0 0\n// test 2: 1 0 0\n// test 3: 0 1 0\n// test 4: 1 1 0\n// test 5: 0 0 1\n// test 6: 1 0 1\n// test 7: 0 1 1\n// test 8: 1 1 1\n\nfunction test1(index, a) {\n if (index < 0)\n index = -index\n return index in a;\n}\nassertEq(test1(1, [1,2]), true);\n\nfunction test2(a) {\n return 0 in a;\n}\nassertEq(test2([1,2]), true);\n\nfunction test3(index, a) {\n return index in a;\n}\n\nvar arr3 = [];\narr3[\"-1073741828\"] = 17;\nassertEq(test3(-1073741828, arr3), true);\n\nfunction test4(a) {\n return -1073741828 in a;\n}\nassertEq(test4(arr3), true);\n\n\nfunction test5(index, a) {\n if (index < 0)\n index = -index\n return index in a;\n}\nvar arr5 = [];\narr5[0] = 1\narr5[1] = 1\narr5[2] = 1\narr5[4] = 1\nassertEq(test5(1, arr5), true);\nassertEq(test5(3, arr5), false);\n\nfunction test7a(a) 
{\n return 3 in a;\n}\nfunction test7b(a) {\n return 4 in a;\n}\nassertEq(test7a(arr5)," +"Canadian media baron Conrad Black said on Wednesday he wanted to increase his stake in Australia's oldest newspaper group, John Fairfax Holdings Ltd, to 50 percent from 25, a day after meeting the prime minister.\nHowever, Black repeated that he would sell out of Fairfax if there was no easing in Australia's current media ownership rules, which he described as \"anachronistic\".\nBlack said that although he could not rule out selling his stake to Kerry Packer, the suggested merger plan by Packer between his media empire and Fairfax was unlikely to suceed. \nBlack told shareholders at Fairfax's annual meeting that he had said to John Howard in their first meeting since Howard became prime minister that he would like to raise his stake, which is held through Hollinger International Inc.\nBlack was asked by reporters after Fairfax's meeting just how much he would want to increase his stake to, if allowed. \"Fifty percent,\" he said.\nThe media baron added that U.S. accounting rules effectively punished companies which had higher than 50 percent stakes in their associates. However, Black repeated that he would sell his Fairfax stake if he was not allowed to raise it further. 
\n\"If the road is truly" +"16\n2\n52\n0\n8\n0\n79\n3\n5\n2\n26\n0\n24\n13\n36\n13\n46\n2\n0\n0\n45\n56\n25\n0\n49\n14\n9\n45\n62\n3\n15\n71\n41\n44\n56\n23\n0\n0\n0\n0\n0\n4\n2\n2\n34\n45\n0\n42\n0\n44\n79\n3\n2\n5\n90\n0\n3\n14\n0\n2\n26\n1\n14\n69\n28\n91\n0\n47\n0\n57\n14\n0\n41\n52\n12\n18\n57\n0\n75\n4\n80\n38\n18\n0\n3\n0\n4\n42\n8\n69\n0\n0\n0\n88\n11\n39\n39\n53\n36\n5\n2\n86\n45\n0\n2\n4\n2\n0\n0\n2\n88\n34\n2\n21\n42\n68\n15\n71\n0\n2\n8\n21\n38\n35\n0\n0\n32\n81\n1\n2\n0\n0\n4\n0\n2\n40\n0\n0\n88\n36\n71\n86\n0\n63\n0\n15\n30\n22\n2\n52\n68\n2\n33\n60\n48\n0\n19\n2\n19\n55\n2\n10\n45\n45\n20\n0\n22\n25\n13\n3\n46\n18\n0\n2\n0\n0\n2\n2\n3\n0\n0\n0\n35\n5\n24\n20\n56\n7\n0\n65\n76\n42\n22\n14\n22\n0\n2\n2\n5\n54" +"---\ntitle: Component Manager\n---\n\n# Component Manager\n\nThe Component is a base element of the template. It might be something simple and atomic like an image or a text box, but also complex structures, more probably composed by other components, like sections or pages. The concept of the component was made to allow the developer to bind different behaviors to different elements. 
For example, opening the Asset Manager on double click of the image is a custom behavior binded to that particular type of element.\n\n::: warning\nThis guide is referring to GrapesJS v0.15.8 or higher\n:::\n\n[[toc]]\n\n\n\n\n\n## How Components work?\n\nLet's see in detail how components work by looking at all the steps from adding an HTML string to the editor.\n\n::: tip\nAll the following snippets can be run directly in console from the [main demo](https://grapesjs.com/demo.html)\n:::\n\nThis is how we can add new components to the canvas:\n\n```js\n// Append components directly to the canvas\neditor.addComponents(`<div>\n <img src=\"https://path/image\" />\n <span title=\"foo\">Hello world!!!</span>\n</div>`);\n\n// or into some, already defined, component.\n// For instance, appending to a selected component would be:\neditor.getSelected().append(`<div>...`);\n\n// Actually, editor.addComponents is an alias of...\neditor.getWrapper().append(`<div>...`);\n```\n\n::: tip\nIf you need" +"# frozen_string_literal: true\n\n# Class builder helps creating anonymous classes that we can use to spec our code\n# We need co create new class instances to have an \"empty\" and \"clear\" class for each spec\n# This module acts as an interface to create classes\nmodule ClassBuilder\n class << self\n # Creates an empty class without any predefined methods\n # @param block [Proc, nil] block that should be evaluated (if given)\n # @return [Class] created anonymous class\n def build(&block)\n klass = Class.new\n\n klass.class_eval(&block) if block_given?\n klass\n end\n\n # This method allows us to create a class that inherits from any other\n # @param klass [Class] any class from which we want to inherit in our anonymous class\n # @param block [Proc] a block of code that should be evaluated in a new anonymous class body\n # @return [Class] new anonymous class\n def inherit(klass, &block)\n Class.new(klass, &block)\n end\n end\nend" +"# data-scientist-roadmap\n\nI just found this 
data science skills roadmap, drew by [Swami Chandrasekaran](http://nirvacana.com/thoughts/becoming-a-data-scientist/) on his cool blog.\n\n****\n\n![roadmap-picture](http://nirvacana.com/thoughts/wp-content/uploads/2013/07/RoadToDataScientist1.png)\n\n****\n\nJobs linked to __data science__ are becoming __more and more popular__. A __bunch of tutorials__ could easily complete this roadmap, helping whoever wants to __start learning stuff about data science__.\n\nFor the moment, a lot is __got on wikipedia__ (except for codes, always handmade). Any help's thus welcome!\n\n## Rules\n\n* __Feel free to fork this repository and pull requests__.\n* Always comment your code.\n* Please respect topology for filenames.\n* There's one README for each directory.\n* Also, could be great to share useful links or ressources in README files." +".PS\ncct_init(SIdefaults)\nifdef(`m4pco',`resetrgb')\n\nlinethick_(.5)\ndefine(`dimen_', 10)\nelen = dimen_*3/2\n\nOrigin: Here\n T: source(up_ elen, AC) ; llabel(,V_{in},); \"in\" above\n bridge_len = dimen_/2\n W: T.centre + (dimen_/2,0)\n N: W + (bridge_len, bridge_len)\n S: W + (bridge_len, -bridge_len)\n E: S + (bridge_len, bridge_len)\n diode(from W to N)\n diode(from S to E)\n R: resistor(from E + (dimen_,0) down_ dimen_); llabel(+,R_{load},-) # ; \"out\" above\n C: capacitor(down_ R.start.y - R.end.y from 0.5 between E and R.start, C+); rlabel(,C,)\n\n setrgb(1,0,0) # red\n dot(at T.end)\n dot(at C.start)\n line from T.end to (N,T.end) then to N; dot\n diode(to E); dot\n line from E to R.start; dot\n resetrgb\n\n setrgb(0,1,0) # green\n dot(at C.end)\n dot(at R.end)\n ground\n line to (W,Here) then to W; dot\n diode(to S); dot\n line to (Here,T.start) then to T.start; dot\n resetrgb\n\n.PE" +"# tests for tickets below 1000 and for tests without a ticket reference\n0xxx\n\n# tests for tickets in range [1000, 2000[\n1xxx\n\n# tests for tickets in range [2000, 3000[\n2xxx\n\n# tests for tickets in range [3000, 4000[\n3xxx\n\n# tests 
for tickets in range [4000, 5000[\n4xxx\n\n# collision of long vehicles\n# collision_long_veh (automatically commented due to no test directory being found)\n\n# ticket672 (automatically commented due to no test directory being found)\n\n# ticket777 (automatically commented due to no test directory being found)\n\n# ticket1047 (automatically commented due to no test directory being found)\n\n5xxx\n6xxx\n7xxx" +"\ufeffusing System;\nusing System.Collections.Generic;\nusing System.Linq;\nusing System.Text;\nusing System.Threading.Tasks;\n\nnamespace HearthstoneAI.State\n{\n enum GameStage\n {\n STAGE_UNKNOWN,\n STAGE_GAME_FLOW,\n STAGE_PLAYER_MULLIGAN,\n STAGE_OPPONENT_MULLIGAN,\n STAGE_PLAYER_CHOICE,\n STAGE_OPPONENT_CHOICE\n }\n\n class GameStageHelper\n {\n static public GameStage GetGameStage(Game game)\n {\n ReadOnlyEntity game_entity;\n if (!game.TryGetGameEntity(out game_entity)) return GameStage.STAGE_UNKNOWN;\n\n ReadOnlyEntity player_entity;\n if (!game.TryGetPlayerEntity(out player_entity)) return GameStage.STAGE_UNKNOWN;\n\n ReadOnlyEntity opponent_entity;\n if (!game.TryGetOpponentEntity(out opponent_entity)) return GameStage.STAGE_UNKNOWN;\n\n if (player_entity.GetTagOrDefault(GameTag.MULLIGAN_STATE, (int)TAG_MULLIGAN.INVALID) == (int)TAG_MULLIGAN.INPUT)\n {\n return GameStage.STAGE_PLAYER_MULLIGAN;\n }\n\n if (opponent_entity.GetTagOrDefault(GameTag.MULLIGAN_STATE, (int)TAG_MULLIGAN.INVALID) == (int)TAG_MULLIGAN.INPUT)\n {\n return GameStage.STAGE_OPPONENT_MULLIGAN;\n }\n\n if (!game_entity.HasTag(GameTag.STEP)) return GameStage.STAGE_UNKNOWN;\n\n TAG_STEP game_entity_step = (TAG_STEP)game_entity.GetTag(GameTag.STEP);\n if (game_entity_step != TAG_STEP.MAIN_ACTION) return GameStage.STAGE_GAME_FLOW;\n\n if (player_entity.GetTagOrDefault(GameTag.CURRENT_PLAYER, 0) == 1) return GameStage.STAGE_PLAYER_CHOICE;\n if (opponent_entity.GetTagOrDefault(GameTag.CURRENT_PLAYER, 0) == 1) return GameStage.STAGE_OPPONENT_CHOICE;\n return GameStage.STAGE_UNKNOWN;\n 
}\n }\n}" +"\ufeffusing System.Collections.Generic;\nusing BizHawk.Emulation.Cores.Components.Z80A;\nusing BizHawk.Emulation.Cores.Sound;\n\nnamespace BizHawk.Emulation.Cores.Computers.SinclairSpectrum\n{\n\t/// <summary>\n\t/// 128K Constructor\n\t/// </summary>\n\tpublic partial class Pentagon128 : SpectrumBase\n\t{\n\t\t/// <summary>\n\t\t/// Main constructor\n\t\t/// </summary>\n\t\tpublic Pentagon128(ZXSpectrum spectrum, Z80A cpu, ZXSpectrum.BorderType borderType, List<byte[]> files, List<JoystickType> joysticks)\n\t\t{\n\t\t\tSpectrum = spectrum;\n\t\t\tCPU = cpu;\n\n\t\t\tCPUMon = new CPUMonitor(this) { machineType = MachineType.Pentagon128 };\n\n\t\t\tROMPaged = 0;\n\t\t\tSHADOWPaged = false;\n\t\t\tRAMPaged = 0;\n\t\t\tPagingDisabled = false;\n\n\t\t\tULADevice = new ScreenPentagon128(this);\n\n\t\t\tBuzzerDevice = new OneBitBeeper(44100, ULADevice.FrameLength, 50, \"SystemBuzzer\");\n\n\t\t\tTapeBuzzer = new OneBitBeeper(44100, ULADevice.FrameLength, 50, \"TapeBuzzer\");\n\n\t\t\tAYDevice = new AY38912(this);\n\t\t\tAYDevice.Init(44100, ULADevice.FrameLength);\n\n\t\t\tKeyboardDevice = new StandardKeyboard(this);\n\n\t\t\tInitJoysticks(joysticks);\n\n\t\t\tTapeDevice = new DatacorderDevice(spectrum.SyncSettings.AutoLoadTape);\n\t\t\tTapeDevice.Init(this);\n\n\t\t\tInitializeMedia(files);\n\t\t}\n\t}\n}" +"Teclas de Atalhos Gen\u00e9ricas\n===========================\n\nAlgumas das teclas de atalhos se aplicam a qualquer painel que esteja\nselecionado. 
Elas s\u00e3o chamadas teclas de atalhos gen\u00e9ricas.\n\nAqui est\u00e1 a lista de todas as teclas de atalhos gen\u00e9ricas, junto com\nsuas respectivas a\u00e7\u00f5es:\n\n '^R' : Redesenhar -> redesenha pain\u00e9is do calcurse, sendo \u00fatil se\n voc\u00ea redimensionar a sua tela de terminal ou\n quando aparece lixo na vis\u00e3o do calcurse\n '^A' : AdicAgend -> adiciona um agendamento ou um evento\n '^T' : AdiTarefa -> adiciona uma tarefa\n 'T' : -1 Dia -> move para dia anterior\n 't' : +1 Dia -> move para dia seguinte\n 'W' : -1 Semana -> move para semana anterior\n 'w' : +1 Semana -> move para semana seguinte\n 'M' : -1 M\u00eas -> move para m\u00eas anterior\n 'm' : +1 M\u00eas -> move para m\u00eas seguinte\n 'Y' : -1 Ano -> move para ano anterior\n 'y' : +1 Ano -> move para ano seguinte\n '^G' : IrPara hoje -> move para o dia de hoje\n\nAs teclas '^P' e '^N' s\u00e3o usadas para rolar texto para cima e para\nbaixo quando se est\u00e1 dentro de telas espec\u00edficas, como, por exemplo,\nas telas de ajuda. Elas tamb\u00e9m s\u00e3o usadas" +"# Changelog\n\n## [v0.11.2]\n\nMinor:\n\n* Switch to standard MIT SPDX license\n\n## [v0.11.0]\n\nFeatures:\n\n* Add support for `Expect-CT` header. Allows excluding domains that will not have the `Expect-CT` header applied. By default, the `Expect-CT` header will not be applied to localhost. It is also only applied to HTTPS requests \n* Add support for `worker-src` directive for `Content-Security-Policy` header\n\n## [v0.10.0]\n\nBreaking Changes:\n\n* Drop support for ASP.NET Core 1.x\n* Add support for ASP.NET Core 3.0\n\n## [v0.9.0]\n\nFeatures:\n\n* Add support for Nonce generation for `Content-Security-Policy` headers. 
See [README.md](https://github.com/andrewlock/NetEscapades.AspNetCore.SecurityHeaders/blob/master/README.md#using-nonces-and-generated-hashes-with-content-security-policy) for details\n* Add [TagHelpers](https://www.nuget.org/packages/NetEscapades.AspNetCore.SecurityHeaders.TagHelpers/) library for adding nonces and generating hashes for Razor elements. \n* Allow using HSTS preload with `Strict-Transport-Security`\n* Allow excluding domains from `Strict-Transport-Security`. Similar to the [Microsoft `HstsMiddleware`](https://github.com/aspnet/BasicMiddleware/blob/master/src/Microsoft.AspNetCore.HttpsPolicy/HstsMiddleware.cs), you can skip applying `Strict-Transport-Security` to specific hosts\n\nBreaking Changes:\n\n* All obsolete classes have been removed.\n* Many classes have changed namespace to better reflect their location in the project, and also to aid discovery. If you're using the recommended builders and extension methods, you should not have any build-time breaking changes, but the package is not runtime-compatible with previous versions\n* The `Strict-Transport-Security` header is no longer applied to `localhost` by default. Generally" +"class AccessPermission < ActiveRecord::Base\n belongs_to :swarm\n belongs_to :user\n belongs_to :creator, :class_name => 'User', :foreign_key => 'creator_id'\n\n before_save :downcase_email\n\n validates :email, uniqueness: { scope: :swarm,\n message: \"can only be given permission on a swarm once\" }\n validates :user, uniqueness: { scope: :swarm,\n message: \"can only be given permission on a swarm once\" }, if: :user\n\n def self.update_legacy_permissions_for(user)\n aps = AccessPermission.where(email: user.email)\n aps.each do |ap|\n ap.user = user\n ap.save\n end\n end\n\n def self.can_alter?(swarm, user)\n if user\n user.is_admin? || swarm.users.include?(user) || swarm.access_permissions.find_by(email: user.email)\n end\n end\n\n def self.can_destroy?(swarm,user)\n if user\n user.is_admin? 
|| swarm.owners.include?(user)\n end\n end\n\n def self.can_alter_permissions?(swarm,user)\n if user\n user.is_admin? || swarm.owners.include?(user)\n end\n end\n\n def self.can_see_user_drafts?(current_user, user)\n current_user && ((current_user == user) || current_user.is_admin?)\n end\n private\n\n def downcase_email\n self.email = self.email.downcase\n end\n\nend" +"# 3.1.3 - 2017-11-21\n\n* **Fixed**: Disable non-working Google+ counter (#210 by @apotheosis91).\n* **Fixed**: `updateCounter`: append counter only once (#209 by @RunnX).\n\n# 3.1.2 - 2016-12-02\n\n* **Fixed**: popup position on dual-screen setups (#190 by @mshevtsov).\n\n# 3.1.1 - 2016-08-19\n\n* **Fixed**: Facebook counter API (#184 by @gldmtr, #186 by @xdimedrolx).\n* **Fixed**: ready state check after Twitter counter removal (#182 by @shvetsgroup).\n* Allow HTML in button captions (#109 by @thenexus00).\n\n# 3.1.0 - 2016-01-10\n\n* Fix Google+ counter.\n* Enable HTTPS for Odnoklassniki.\n* `ready.social-likes` now triggered after all counters (#166, by @scream4ik).\n* Open all popups with HTTPS.\n* Update popup sizes.\n\n3.0.15 was the last version available via Bower. 
Now Social Likes is available only via npm.\n\n# 3.0.15 - 2015-11-21\n\n* Disable discontinued Twitter button (#147).\n* Trigger counter update if number equals zero and zeroes option specified (#151, by @ColCh).\n\n# 3.0.14 - 2015-03-10\n\n* Revert counters changes from previous release because \n* Disable Odnoklassniki counter on HTTPS because of redirect to HTTP.\n* Show counters after 10 sec even if they aren\u2019t ready (instead of waiting for browser\u2019s 30 sec timeout).\n* Don\u2019t add a colon to tweet if it ends on" +"# ebuild\n\n> A low level interface to the Gentoo Portage system.\n\n- Create or update the package manifest:\n\n`ebuild {{path/to/file.ebuild}} manifest`\n\n- Clean the temporary build directories for the build file:\n\n`ebuild {{path/to/file.ebuild}} clean`\n\n- Fetch sources if they do not exist:\n\n`ebuild {{path/to/file.ebuild}} fetch`\n\n- Extract the sources to a temporary build directory:\n\n`ebuild {{path/to/file.ebuild}} unpack`\n\n- Compile the extracted sources:\n\n`ebuild {{path/to/file.ebuild}} compile`\n\n- Install the package to a temporary install directory:\n\n`ebuild {{path/to/file.ebuild}} install`\n\n- Install the temporary files to the live filesystem:\n\n`ebuild {{path/to/file.ebuild}} qmerge`\n\n- Fetch, unpack, compile, install and qmerge the specified ebuild file:\n\n`ebuild {{path/to/file.ebuild}} merge`" +"\"\"\"\nUtils related to processing or registering for webhooks\n\nA model registers itself here if it wants to be in the list of processing\nfunctions for a particular webhook. Each processor will have the ability\nto modify the event object, access event data, and do what it needs to do\n\nregistrations are keyed by top-level event type (e.g. 
\"invoice\", \"customer\", etc)\nEach registration entry is a list of processors\nEach processor in these lists is a function to be called\nThe function signature is:\n <Event object>\n\nThere is also a \"global registry\" which is just a list of processors (as defined above)\n\nNOTE: global processors are called before other processors.\n\"\"\"\nimport functools\nimport itertools\nfrom collections import defaultdict\n\n__all__ = [\"handler\", \"handler_all\", \"call_handlers\"]\n\n\nregistrations = defaultdict(list)\nregistrations_global = list()\n\n# Legacy. In previous versions of Stripe API, all test events used this ID.\n# Check out issue #779 for more information.\nTEST_EVENT_ID = \"evt_00000000000000\"\n\n\ndef handler(*event_types):\n \"\"\"\n Decorator that registers a function as a webhook handler.\n\n Functions can be registered for event types (e.g. 'customer') or\n fully qualified event sub-types (e.g. 'customer.subscription.deleted').\n\n If an event type is specified, the handler will receive callbacks for\n ALL webhook events of that" +"#\n# Step 1: Build sydent and install dependencies\n#\nFROM docker.io/python:3.8-alpine as builder\n\n# Install dev packages\nRUN apk add --no-cache \\\n build-base \\\n libressl-dev \\\n libffi-dev\n\n# Add user sydent\nRUN addgroup -S -g 993 sydent \\\n && adduser -D --home /sydent -S -u 993 -G sydent -s /bin/ash sydent \\\n && echo \"sydent:$(dd if=/dev/random bs=32 count=1 | base64)\" | chpasswd\n\n# Copy resources\nCOPY --chown=sydent:sydent [\"res\", \"/sydent/res\"]\nCOPY --chown=sydent:sydent [\"scripts\", \"/sydent/scripts\"]\nCOPY --chown=sydent:sydent [\"sydent\", \"/sydent/sydent\"]\nCOPY --chown=sydent:sydent [\"README.rst\", \"setup.cfg\", \"setup.py\", \"/sydent/\"]\n\n# Install dependencies\nRUN cd /sydent \\\n && su sydent -c 'pip install --user --upgrade pip setuptools sentry-sdk' \\\n && su sydent -c 'pip install --user -e .' 
\\\n && rm -rf /sydent/.cache \\\n && find /sydent -name '*.pyc' -delete\n\n#\n# Step 2: Reduce image size and layers\n#\n\nFROM docker.io/python:3.8-alpine\n\n# Install packages\nRUN apk add --no-cache \\\n libressl \\\n libffi\n\n# Add user sydent and create /data directory\nRUN addgroup -S -g 993 sydent \\\n && adduser -D --home /sydent -S -u 993 -G sydent -s /bin/ash sydent \\\n && echo \"sydent:$(dd if=/dev/random bs=32 count=1 | base64)\" | chpasswd \\\n && mkdir /data \\\n && chown sydent:sydent /data\n\n# Copy sydent\nCOPY --from=builder" +"Changes in Traitlets\n====================\n\nTraitlets 5.0\n-------------\n\n5.0.4\n*****\n\n- Support deprecated use of byte-literals for bytes on the command-line: ``ipython kernel --Session.key=\"b'abc'\"``. The `b` prefix is no longer needed in traitlets 5.0, but is supported for backward-compatibility\n- Improve output of configuration errors, especially when help output would make it hard to find the helpful error message\n\n5.0.3\n*****\n\n- Fix regression in handling `--opt=None` on the CLI for configurable traits\n with `allow_none=True`\n\n5.0.2\n*****\n\n- Fix casting bytes to unicode\n\n5.0.0\n*****\n\n\n(This is an in-progress changelog, please let us know if something is missing/or could be phrased better)\n\nTraitlets 5.0 is a new version of traitlets that accumulate changes over a period of more close to four years; A number of\ninternal refactoring made the internal code structure cleaner and simpler, and greatly improved the diagnostic error\nmessages as well has help and documentation generation.\n\nWe expect no code change needed for any consumer of the Python API (ipywidgets, and alike),\nthough CLI argument parsing have seen a complete rewrite,\nso if you have an application that does use the parsing logic of traitlets you may see changes in behavior,\nand now have access to more features." 
+"require 'java'\nrequire 'purugin/predicate'\n\nclass cb::CraftWorld\n extend Purugin::Predicate\n \n ##\n # Get the block at the given coordinates\n # === Parameters\n # * _x_,_y_,_z_ - Give three coord. location\n # * _location_ - Provide a location object\n # === Examples\n # e.player.world.block_at(20, 20, 20) #=> Block instance\n # e.player.world.block_at(Location.new(20, 20, 20)) #=> ditto\n #\n def block_at(*r)\n getBlockAt *r\n end\n \n ##\n # Gets the chunk that this location is within.\n # === Parameters\n # * _location_ - location within the chunk you are asking for\n # === Examples\n # e.player.world.chunk_at(me) #=> Location coerced from your location\n # e.player.world.chunk_at(some_location) #=> Give an explicit location\n def chunk_at(location)\n location = location.respond_to?(:to_loc) ? location.to_loc : location\n \n getChunkAt(location)\n end\n \n ##\n # Is the provided chunk currently loaded by this world. This also can be called\n # with the\n # === Parameters\n # * _chunk_or_x_ * - is the chunk instance you are inquiring about or it is x coordinate\n # * z * - is optional for the [x,y] version of this method.\n # === Examples\n # e.player.world.chunk_loaded? some_chunk\n # e.player.world.chunk_loaded?(30, 13)\n def chunk_loaded?(chunk_or_x, z=nil)\n z ? isChunkLoaded(chunk_or_x, z) : isChunkLoaded(chunk_or_x)\n end\n\n ##\n # Is the chunk being actively used by players (also must be loaded).\n #" +"Rank = 0 Free Memory = 13187 MB\nRank = 1 Free Memory = 13187 MB\nRank = 2 Free Memory = 13187 MB\nRank = 3 Free Memory = 13187 MB\nRank = 4 Free Memory = 13187 MB\nRank = 5 Free Memory = 13187 MB\nRank = 6 Free Memory = 13187 MB\nRank = 7 Free Memory = 13187 MB\nRank = 8 Free Memory = 13187 MB\nRank = 9 Free Memory = 13187 MB\nRank = 10 Free Memory = 13187 MB\nRank = 11 Free Memory = 13187 MB\nWARNING ParticleSet::checkBoundBox 3.4> SimulationCellRadius=1.9475\n Using SLOW method for the sphere update. 
\nWARNING Skip Chiesa estimator due to the lack of support with SoA. Access the correction via AoS at the moment." +" # The Complete Android App Development\n\n### In this course you will learn how to make 17 online games, and apps for Android, using Java. Enroll using a [ 95% discount coupon](https://www.udemy.com/course/android-tutorial-for-beginners/?referralCode=BC559D7466CF58A9CC48). \n\nBellow, list of open source Apps that we build in tutorial\n\n- [Find My Age App](#).\n- [Tic Tac Toe Local App](#).\n- [Calculator App](#).\n- [Pok\u00e9mon Game App](#).\n- [Zoo App](#).\n- [Restaurants App](#).\n- [Find Sunrise time App](#).\n- [My Notes App](#).\n- [Tic Tac Toe using Firebase App](#).\n- [Facebook App using Firebase](#).\n- [MediaPlayer App](#).\n- [Alaram App](#).\n- [Notification Channel App](#).\n- [Light sensor App](#).\n- [Nimbuzz vibrate](#).\n- [Find My Phone App](#).\n- [Twitter App using Php + MySQL](#).\n\n\n\n\n![main](http://attach.alruabye.net/androidTutorialForBeginners/androidTutorialForBeginners.jpg)\n \n \n This\u00a0source will help the beginners to start build their own Android \u00a0apps \u00a0from scratch. By the end of this course\u00a0you will be able to build real world android apps. 
In this\u00a0course you will learn how to build and design secure\u00a0android apps avoiding Android\u00a0Vulnerabilities, and how to work with android layout tools to design very attractive and responsive\u00a0layout that work with different device size, then you will learn how to use Sqlite as local database storage \u00a0and" +"---\ntitle: Orientation Programme\ncategory: Human Resources\ndate: \"2020-03-12\"\ntags: ['orientation', 'mentor', 'welcome-kit','newcomer']\ndescription: Follow the programme to quickly adapt the employee to the Atolye15 culture and own position.\n\n---\n\n- [ ] Complete the groundwork \nPrepare the equipment and the workspace after the offer has been accepted by the candidate (laptop, mouse, keyboard, etc.)\n\n- [ ] Prepare the technical tools and the communication channels \nYou can refer to [this checklist](https://checklist.atolye15.com/checklist/newcomer-accounts-checklist)\n\n- [ ] Determine or prepare an initial project \nInclude an example project in order to introduce Atolye15 and speed up the adaptation process. \n\n- [ ] Assign a mentor \nDetermine mentor who is going to pass information and lead the way about job content. \n\n- [ ] Prepare the welcome kit \nThe content should be a surprise! \ud83d\udce6 \n\n- [ ] Say Hi \ud83d\udc4b\nThe first working day newcomer is introduced to the whole team and the welcome kit unboxes all together.\n\n- [ ] Information trip \nThe new employee stars working according to the planned schedule and acquaints with employees from different departments and gets information. They also receive the HR guide details at the same time.\n\n- [ ] Welcome on board!" +"\ufeffusing Microsoft.Xna.Framework;\nusing Microsoft.Xna.Framework.Graphics;\nusing System;\nusing Terraria;\nusing Terraria.GameInput;\nusing Terraria.UI;\n\nnamespace ExampleMod.UI\n{\n\t// This class wraps the vanilla ItemSlot class into a UIElement. 
The ItemSlot class was made before the UI system was made, so it can't be used normally with UIState. \n\t// By wrapping the vanilla ItemSlot class, we can easily use ItemSlot.\n\t// ItemSlot isn't very modder friendly and operates based on a \"Context\" number that dictates how the slot behaves when left, right, or shift clicked and the background used when drawn. \n\t// If you want more control, you might need to write your own UIElement.\n\t// I've added basic functionality for validating the item attempting to be placed in the slot via the validItem Func. \n\t// See ExamplePersonUI for usage and use the Awesomify chat option of Example Person to see in action.\n\tinternal class VanillaItemSlotWrapper : UIElement\n\t{\n\t\tinternal Item Item;\n\t\tprivate readonly int _context;\n\t\tprivate readonly float _scale;\n\t\tinternal Func<Item, bool> ValidItemFunc;\n\n\t\tpublic VanillaItemSlotWrapper(int context = ItemSlot.Context.BankItem, float scale = 1f) {\n\t\t\t_context = context;\n\t\t\t_scale = scale;\n\t\t\tItem = new Item();\n\t\t\tItem.SetDefaults(0);\n\n\t\t\tWidth.Set(Main.inventoryBack9Texture.Width * scale, 0f);\n\t\t\tHeight.Set(Main.inventoryBack9Texture.Height * scale, 0f);\n\t\t}\n\n\t\tprotected override void DrawSelf(SpriteBatch spriteBatch) {\n\t\t\tfloat oldScale =" +"# Calls without locks ... add a caller -> callee line in\n# this file when ...\n# - every call from caller to callee is done outside a PM_LOCK-PM_UNLOCK\n# block in the caller file\n#\n# Do not add an entry if the callee only locks __pmLock_extcall (this is\n# the lowest lock in the hierarchy, so it does not matter what locks are\n# held when __pmLock_extcall is locked as this cannot cause lock inversion\n# or deadlock\n#\n# Note, we are not concerned about any locks that may be held on entry\n# to the caller routine, this is just about locking in the caller and\n# calls to the callee. 
By adding an entry in this file we are asserting\n# that calls from caller to callee cannot be contributing to any lock\n# inversion problems or violation of the documented lock hierarchy in\n# libpcp.\n#\n# Lines in this file starting with # are treated as comments.\n#\n# Blank lines in this file are ignored\n#\n\n# util.c\npmflush -> pmGetOptionalConfig\nvpmprintf -> pmGetOptionalConfig\n__pmOpenLog -> logreopen\n__pmOpenLog -> logheader\n__pmRotateLog -> logreopen\n__pmRotateLog -> logheader\n__pmOpenLog -> __pmNoMem\n\n# config.c" +"package compiler\n\nimport (\n\t\"errors\"\n\n\t\"github.com/twtiger/gosecco/tree\"\n)\n\n// The boolean compiler uses the stack for simplicity, but we could probably do without\n// It generates suboptimal code, expecting a peephole stage after\n// It will always take jump points as arguments\n// Jump points are arbitary types that represents where to jump.\n// All the different boolean situations can be represented using this structure.\n// The conditional compiler is also a boolean compiler\n\ntype booleanCompilerVisitor struct {\n\tctx *compilerContext\n\terr error\n\ttopLevel bool\n\tjt label\n\tjf label\n}\n\nfunc compileBoolean(ctx *compilerContext, inp tree.Expression, topLevel bool, jt, jf label) error {\n\tv := &booleanCompilerVisitor{ctx: ctx, jt: jt, jf: jf, topLevel: topLevel}\n\tinp.Accept(v)\n\treturn v.err\n}\n\nfunc (s *booleanCompilerVisitor) AcceptAnd(v tree.And) {\n\tnext := s.ctx.newLabel()\n\n\tif err := compileBoolean(s.ctx, v.Left, false, next, s.jf); err != nil {\n\t\ts.err = err\n\t\treturn\n\t}\n\n\ts.ctx.labelHere(next)\n\n\tif err := compileBoolean(s.ctx, v.Right, false, s.jt, s.jf); err != nil {\n\t\ts.err = err\n\t\treturn\n\t}\n}\n\n// AcceptArgument implements Visitor\nfunc (s *booleanCompilerVisitor) AcceptArgument(v tree.Argument) {\n\ts.err = errors.New(\"an argument variable was found in a boolean expression - this is likely a programmer error\")\n}\n\n// AcceptArithmetic implements Visitor\nfunc 
(s *booleanCompilerVisitor) AcceptArithmetic(v tree.Arithmetic) {\n\ts.err =" +"201\n0.0025 0 \n0 0 \n0.005 0 \n0.0075 0 \n0.01 0 \n0.0125 0 \n0.015 0 \n0.0175 0 \n0.02 0 \n0.0225 0 \n0.025 0 \n0.0275 0 \n0.03 0 \n0.0325 0 \n0.035 0 \n0.0375 0 \n0.04 0 \n0.0425 0 \n0.045 0 \n0.0475 0 \n0.05 0 \n0.0525 0 \n0.055 0 \n0.0575 0 \n0.06 0 \n0.0625 0 \n0.065 0 \n0.0675 0 \n0.07 0 \n0.0725 0 \n0.075 0 \n0.0775 0 \n0.08 0 \n0.0825 0 \n0.085 0 \n0.0875 0 \n0.09 0 \n0.0925 0 \n0.095 0 \n0.0975 0 \n0.1 0 \n0.1025 0 \n0.105 0 \n0.1075 0 \n0.11 0 \n0.1125 0 \n0.115 0 \n0.1175 0 \n0.12 0 \n0.1225 0 \n0.125 0 \n0.1275 0 \n0.13 0 \n0.1325 0 \n0.135 0 \n0.1375 0 \n0.14 0 \n0.1425 0 \n0.145 0 \n0.1475 0 \n0.15 0 \n0.1525 0 \n0.155 0 \n0.1575 0 \n0.16 0 \n0.1625 0 \n0.165 0 \n0.1675 0 \n0.17 0 \n0.1725 0 \n0.175 0 \n0.1775 0 \n0.18 0 \n0.1825 0 \n0.185 0 \n0.1875 0 \n0.19 0 \n0.1925 0 \n0.195 0 \n0.1975 0 \n0.2 0 \n0.2025 0 \n0.205 0 \n0.2075 0 \n0.21 0 \n0.2125 0 \n0.215 0 \n0.2175 0 \n0.22 0 \n0.2225 0 \n0.225 0 \n0.2275 0 \n0.23 0 \n0.2325 0 \n0.235 0 \n0.2375 0 \n0.24 0 \n0.2425 0 \n0.245 0 \n0.2475" +"== 0.7.6 [8/26/2014]\n\nNew:\n\n- Extended renderEach documentation [rprince]\n- Added bower.json file [mkoryak]\n\nFixed:\n\n- Indentation in README [crzidea]\n\n== 0.7.5 [2/22/2014]\n\nNew:\n\n- Allow for different selectors in the push location proxy and only bind them within the app's element [kevingessner]\n- Add AMD support to Handlebars [rikkert]\n- Allow template paths with query parameters [trengrj]\n- Allow JSON paths with query parameters [vicb]\n- Support for new Google Analytics [erpheus]\n- More documentation for contextMatchesOptions [PhilippSoehnlein]\n\nFixed:\n\n- Documentation for onComplete was formatted incorrectly [togakangaroo]\n- AMD issues [rikkert]\n- Hostname issues in IE [teelahti]\n- Clicking an element within a link that targets a different window should be ignored [dtretyakov]\n- Allow using jQuery in noConflict mode [vicb]\n- Test for console.log is 
broken in IE8 [CodeOtter]\n- When not passing a verb to `route`, the assignment of path is broken [luckydrq]\n\n== 0.7.4 [1/27/2013]\n\nFixed:\n\n- Hotfix for bad jshinting of form matching\n\n== 0.7.3 [1/27/2013]\n\nNew:\n\n- Support for asynchronous chained callbacks [deitch]\n- Make plugins AMD enabled [delrosario]\n- Mixpanel and kissmetrics plugins [jpgarcia]\n\nFixed:\n\n- Better target checking for links and forms [jamesarosen]\n- Changed $.live() to $.delegate() [smithkl42]\n\n== 0.7.2 [10/19/2012]" +"========\nFeatures\n========\n\n- :doc:`Document versioning <../chapters/versioning>`.\n\n - Store many versions of the same document, download or revert to a\n previous version.\n\n- :doc:`Digital signatures <../chapters/signatures>`.\n\n - Check the authenticity of documents by verifying their embedded\n cryptographic signatures or upload detached signatures for document\n signed after they were stored.\n\n- :doc:`Collaboration tools <../parts/collaboration>`.\n\n - Discuss documents, or comment on new versions of a document.\n\n- :doc:`User-defined document metadata <../chapters/metadata>`.\n\n - Several metadata fields can be matched to a document type as per technical,\n legal or structural requirements such as the `Dublin core`_.\n - Metadata fields can have an initial value, which can be static or determined\n by a template code snippet provided by the user.\n\n- :doc:`Documents can be uploaded from different sources <../chapters/sources>`.\n\n - Local file or server side file uploads, multifunctional copier, or even via\n email.\n\n- Batch uploads.\n\n - Many documents can be upload in a single action.\n - Clone a document's metadata for speedier uploads and eliminate repetitive\n data entry.\n\n- Previews for many file formats.\n\n - Mayan EDMS provides image preview generation for many popular file\n formats.\n\n- Office document format support.\n\n - Mayan EDMS can detect the presence of Libre Office and use it" +"# Collections Plugins 
Directory\n\nThis directory can be used to ship various plugins inside an Ansible collection. Each plugin is placed in a folder that\nis named after the type of plugin it is in. It can also include the `module_utils` and `modules` directory that\nwould contain module utils and modules respectively.\n\nHere is an example directory of the majority of plugins currently supported by Ansible:\n\n```\n\u2514\u2500\u2500 plugins\n \u251c\u2500\u2500 action\n \u251c\u2500\u2500 become\n \u251c\u2500\u2500 cache\n \u251c\u2500\u2500 callback\n \u251c\u2500\u2500 cliconf\n \u251c\u2500\u2500 connection\n \u251c\u2500\u2500 filter\n \u251c\u2500\u2500 httpapi\n \u251c\u2500\u2500 inventory\n \u251c\u2500\u2500 lookup\n \u251c\u2500\u2500 module_utils\n \u251c\u2500\u2500 modules\n \u251c\u2500\u2500 netconf\n \u251c\u2500\u2500 shell\n \u251c\u2500\u2500 strategy\n \u251c\u2500\u2500 terminal\n \u251c\u2500\u2500 test\n \u2514\u2500\u2500 vars\n```\n\nA full list of plugin types can be found at [Working With Plugins](https://docs.ansible.com/ansible/2.9/plugins/plugins.html)." +"DATELINE DECEMBER 10, 2001\n\nFOR IMMEDIATE RELEASE\n\nBOCHS PROJECT RELEASES USABILITY ENHANCED EMULATOR\n\"Release adds easy to use configuration and networking support\"\n\nDecember 10, 2001 (The INTERNET) - The Bochs IA-32 Emulator Project unveiled a new\nversion of the popular Bochs emulator to the public today, improving on the\nstability and ground breaking improvements of Bochs 1.2.\n Bochs 1.3 includes many major enhancements including a powerful menu-based\nconfiguration system and networking support for Linux and Windows NT/2000. Other\nadditions in this release include support for ISO-format disk images, improved\nmouse performance, physical CD-ROM support for all versions of Windows, parallel\nport emulation, enhanced debugger, and many cpu and device model improvements. Bochs\n1.3 also adds native support for Mac OS X and Amiga MorphOS, along with improved\nsupport for BeOS. 
You can find Bochs binaries for Windows and Linux (along with the\nsource code for UNIX, Linux, Windows, and Mac OS) at http://bochs.sourceforge.net .\n\nABOUT BOCHS\n\n Bochs is the most popular open source x86 emulator available today. Bochs\ncompiles on a number of platforms including UNIX(R) and UNIX-like systems, Windows,\nand MacOS.\n Bochs can be used for many purposes such as running Windows on other\nplatforms, trying a new operating" +"require 'csv'\n\nclass Districts\n\n # Generates a mapping of zip codes to districts.\n # This is done, in theory, just once after redistricting until the next one\n #\n # This mapping is generated by using our boundary services loaded ZCTAs,\n # and congressional districts. It looks up all ZCTAs in the system, then\n # checks each one to see which districts it intersects, and saves to CSV.\n\n # options:\n # zip: do a particular zip code\n\n def self.run(options = {})\n # failsafe, should be many less records than this\n maximum = 100000\n\n page = 1\n per_page = options[:per_page] ? options[:per_page].to_i : 100\n\n zip_count = 0\n\n dest = Environment.config['location']['zips']\n FileUtils.mkdir_p File.dirname(dest)\n if File.exists?(dest)\n \tFileUtils.rm dest\n end\n\n # Zip.delete_all\n\n errors = []\n\n CSV.open(dest, \"w\") do |csv|\n while page < (maximum / per_page)\n puts \"Fetching page #{page}...\"\n\n if options[:zip]\n zips = [options[:zip]]\n else\n zips = zips_for page, per_page, options\n\n if zips.nil?\n Report.failure self, \"Failure paging through zips on page #{page}, aborting\"\n return\n end\n end\n\n zips.each do |zip|\n puts \"[#{zip}] Finding districts...\"\n\n districts = districts_for zip, options\n if districts.nil?\n errors << {zip: zip, message: \"Couldn't load districts intersecting this zip\"}\n next\n end\n\n districts.each do |district|\n csv << [zip, district[:state], district[:district]]\n end\n\n # cache in" +"/**\n * Class CommandInvoker. 
Takes all CLI operations and calls certain CLI operation depends of variables.\n */\nclass CommandInvoker {\n /**\n * Sets CLI operations (functions).\n * @constructor\n *\n * @param addModule - The function for creating a new module.\n * @param deleteModule - The function for deleting existing module.\n * @param chooseStack - The function for choosing stack of technologies.\n * @param deleteStack - The function for delete stack of technologies.\n */\n constructor(addModule, deleteModule, chooseStack, deleteStack) {\n this.addModule = addModule;\n this.deleteModule = deleteModule;\n this.chooseStack = chooseStack;\n this.deleteStack = deleteStack;\n }\n\n /**\n * Calls CLI operation with correct location.\n *\n * @param func - The func to call.\n * @param location - The location for a new module [client|server|both].\n * @param args - The function for deleting existing module.\n */\n static runCommand(func, { location, ...args }) {\n const runFunc = packageName => func({ ...args, packageName });\n\n if (location === 'both') {\n runFunc('client');\n runFunc('server');\n } else {\n runFunc(location);\n }\n }\n\n /**\n * Runs operation (function) for creating a new module.\n */\n runAddModule(args, options, logger) {\n runOperation(this.addModule, args, options, logger);\n }\n\n /**\n * Runs operation (function) for deleting existing module.\n */\n runDeleteModule(args, options, logger) {\n runOperation(this.deleteModule, args, options, logger);\n }\n\n /**\n *" +"import Foundation\n\n\n// MARK: - NotificationReplyStore\n//\nclass NotificationReplyStore {\n\n /// Shared Instance.\n ///\n static let shared = NotificationReplyStore()\n\n /// Unit Testing Helper: Allows us to hack the current date.\n ///\n private var overridenNow: Date?\n\n /// Our beautiful and private Initializer. 
In here we'll trigger the cleanup sequence, which removes outdated replies!.\n ///\n private init() {\n purgeOldReplies()\n }\n\n /// Unit Testing Helper: Allows us to hack the current date.\n ///\n init(now: Date) {\n overridenNow = now\n purgeOldReplies(now: now)\n }\n\n\n /// Retrieves the cached reply, for the specified notificationID (if any).\n ///\n func loadReply(for notificationID: String) -> String? {\n return replies[notificationID]\n }\n\n /// Stores a given reply, for the specified notificationID.\n ///\n func store(reply: String, for notificationID: String) {\n replies[notificationID] = reply\n timestamps[notificationID] = overridenNow?.normalizedDate() ?? Date().normalizedDate()\n }\n\n /// Meant for unit testing purposes. Effectively nukes the cached replies.\n ///\n func reset() {\n replies = [:]\n timestamps = [:]\n }\n}\n\n\n// MARK: - Private Methods\n//\nprivate extension NotificationReplyStore {\n\n /// Nukes entries older than `Settings.timeToLiveInDays`.\n ///\n func purgeOldReplies(now: Date = Date()) {\n guard let expiredKeys = findExpiredKeys(inRelationTo: now) else {\n return\n }\n\n removeEntries(with: expiredKeys)\n }\n\n /// Returns the collection of expired keys, when compared to the specified Date.\n ///" +"# Edge Telemetry processor\n\n[Home](readme.md)\n\n## Overview\n\nThe telemetry processor processes all edge telemetry in that it\n\n* Filters out edge events and discards them (processed by the [Edge Event processor](events.md)).\n* Decodes binary PubSub (UADP) network messages\n* Converts PubSub Network messages into simple messages\n* Forwards these and other telemetry to a secondary Event Hub to [forward to applications](ux.md), process through TSI and/or store in [Datalake](cdm.md).\n\nThe edge telemetry processor is an event processor host and can be scaled out to handle the configured number of partitions. 
It connects to the \"telemetry\" consumer group on IoT Hub.\n\n## Docker image\n\n`docker pull mcr.microsoft.com/iot/industrial-iot-telemetry-processor:preview`" +"{-# LANGUAGE DeriveGeneric #-}\n{-# LANGUAGE GADTs #-}\n-- | Transform a composition by splitting parents.\nmodule Komposition.Composition.Split where\n\nimport Komposition.Prelude\n\nimport Control.Lens\nimport qualified Data.List.NonEmpty as NonEmpty\n\nimport Komposition.Composition\nimport Komposition.Duration\nimport Komposition.Focus\nimport Komposition.MediaType\n\ndata ParallelSplitMode = OnExactClips Int Int | OnClipsNearFocus\n deriving (Eq, Show, Generic)\n\nsplit ::\n ParallelSplitMode\n -> Focus (ToFocusType Timeline)\n -> Timeline a\n -> Maybe (Timeline a, Focus (ToFocusType Timeline))\nsplit = splitOnTimeline\n\nsplitOnTimeline\n :: ParallelSplitMode\n -> SequenceFocus\n -> Timeline a\n -> Maybe (Timeline a, SequenceFocus)\nsplitOnTimeline _ (SequenceFocus sIdx (Just (ParallelFocus pIdx Nothing))) (Timeline seqs) = do\n let newFocus = SequenceFocus sIdx (Just (ParallelFocus (pred pIdx) Nothing))\n Sequence ann pars <- toList seqs `atMay` sIdx\n let (p1, p2) = NonEmpty.splitAt pIdx pars\n ps1 <- nonEmpty p1\n ps2 <- nonEmpty p2\n Just\n ( Timeline (replaceManyAt sIdx [Sequence ann ps1, Sequence ann ps2] seqs)\n , newFocus)\nsplitOnTimeline splitMode (SequenceFocus idx (Just subFocus)) (Timeline seqs) = do\n (seq', newSubFocus) <- splitOnSequence splitMode subFocus =<< toList seqs `atMay` idx\n pure (Timeline (seqs & ix idx .~ seq'), SequenceFocus idx (Just newSubFocus))\nsplitOnTimeline _ _ _ = mzero\n\nsplitOnSequence\n :: ParallelSplitMode\n -> ParallelFocus\n -> Sequence a\n -> Maybe (Sequence a, ParallelFocus)\nsplitOnSequence splitMode (ParallelFocus pIdx (Just (TrackFocus mediaType' (Just" +"structure Statistics : STATISTICS = struct\n\n type stat = {name: string, enabled: bool ref, table: (string, int ref) Util.alist ref}\n\n fun new s r = {name=s,enabled=r,table=ref 
(Util.emptyAlist())}\n\n fun incr {name,table,enabled} s = \n if !enabled then\n case Util.lookupAlist (!table) s of\n SOME r => r := !r + 1\n | NONE => table := Util.extendAlist (!table) (s,ref 1)\n else ()\n\n fun report {name,enabled,table} =\n if !enabled then\n let val sz = List.foldl (fn ((s,_),a) => (Int.max(size s,a))) 0 (!table)\n val () = Util.prln (\"[\" ^ name ^ \" Statistics Report]\")\n val () = List.app (fn (s,r) => \n let val line = StringCvt.padRight #\" \" sz s ^ \" -> \" ^ Int.toString(!r)\n in Util.prln (\" \" ^ line)\n end) (!table)\n in ()\n end\n else ()\nend" +" ;; Test file for AMD64 inline hooks.\n[BITS 64]\ndefault rel\n\n%macro HookTestCase 2\n db '---', 10\n db 'arch: AMD64', 10\n db 'name: ' ; The name of the test case.\n db %2, 10\n db '...', 10\n db '<bin>'\ntest%1:\n%endmacro\n\n%macro EndHookTestCase 0\n nop\n nop\n nop\n nop\n db '</bin>', 10\n nop\n nop\n nop\n nop\n%endmacro\n\n\nsection .text:\nstart:\n ;; The Start of the .text section is mapped at offset 0.\n ;; We mark it so it can be easily found by the profile\n ;; generation script.\n ;; Pad here by exactly 0x100 chars to ensure that __target__ is at\n ;; offset 0x100.\n db '__start__ '\n db ' '\n db ' '\n db ' '\n\ntarget:\n db '__target__'\n ;; The target for all the jumps.\n ret\n\ntemp: dq 0\n\n ;; Following are test cases for hooks.\n HookTestCase ImmediateJump, \"ImmediateJump\"\n jmp target\n EndHookTestCase\n\n HookTestCase IndirectJump, \"IndirectJump\"\n lea rax, [rel target]\n jmp rax\n EndHookTestCase\n\n HookTestCase IndirectJump2, \"IndirectJump2\"\n lea rax, [target]\n mov [temp], rax\n mov rbx, [temp]\n jmp rbx\n EndHookTestCase\n\n HookTestCase IndirectJump3, \"IndirectJump3\"\n lea rbx, [target]\n jmp rbx\n EndHookTestCase\n\n HookTestCase IndirectJump4, \"IndirectJump4\"\n lea rcx, [target]\n jmp rcx\n EndHookTestCase\n\n HookTestCase IndirectJump5, \"IndirectJump5\"\n lea rcx, [target]\n mov [start], rcx\n jmp [start]\n EndHookTestCase\n\n HookTestCase 
PushRet, \"PushRet\"" +"<p>\nRadley is based on lettering originally drawn and designed for woodcarved titling work.\nIt was later digitized and extended to be used on the web.\nRadley is a practical face, based on letterforms used by hand carvers who cut letters quickly, efficiently, and with style.\nIt can be used for both titling and text typography.\n</p>\n<p>\nThe basic letterforms in Radley grew out of sketching and designing directly into wood with traditional carving chisels.\nThese were scanned and traced into FontForge and cleaned up digitally, then the character set was expanded.\nThere is something unique about carving letters into wood with traditional hand tools, and hopefully Radley carries some of the original spirit of these hand carved letterforms.\n</p>\n<p>\nSince the initial launch in 2012, Radley was updated by Vernon Adams adding an Italic and support for more Latin languages.\nHe made many glyph refinements throughout the family based on user feedback.\nIn 2017 the family was updated by Marc Foley to complete the work started by Vernon.\n</p>\n<p>\nTo contribute, see <a href=\"https://github.com/googlefonts/RadleyFont\">github.com/googlefonts/RadleyFont</a>\n</p>" +"package hystrix\n\ntype executorPool struct {\n\tName string\n\tMetrics *poolMetrics\n\tMax int\n\tTickets chan *struct{}\n}\n\nconst ConcurrentRequestsLimit = 5000\n\nfunc newExecutorPool(name string) *executorPool {\n\tp := &executorPool{}\n\tp.Name = name\n\tp.Metrics = newPoolMetrics(name)\n\tp.Max = getSettings(name).MaxConcurrentRequests\n\tif p.Max > ConcurrentRequestsLimit {\n\t\tp.Max = ConcurrentRequestsLimit\n\t}\n\n\tp.Tickets = make(chan *struct{}, p.Max)\n\tfor i := 0; i < p.Max; i++ {\n\t\tp.Tickets <- &struct{}{}\n\t}\n\n\treturn p\n}\n\nfunc (p *executorPool) Return(ticket *struct{}) {\n\tif ticket == nil {\n\t\treturn\n\t}\n\n\tp.Metrics.Updates <- poolMetricsUpdate{\n\t\tactiveCount: p.ActiveCount(),\n\t}\n\tp.Tickets <- ticket\n}\n\nfunc (p *executorPool) 
ActiveCount() int {\n\treturn p.Max - len(p.Tickets)\n}" +"12--Group/12_Group_Large_Group_12_Group_Large_Group_12_15.jpg\n20\n635.269 445.662 36.0747 43.9092 0.997739\n407.332 367.71 46.8934 57.5458 0.99694\n519.605 393.583 32.9069 39.5234 0.99631\n805.862 427.317 35.2679 40.9305 0.995612\n76.062 416.187 35.1402 42.0799 0.995379\n736.555 344.2 30.9573 38.8431 0.994962\n243.631 355.414 35.0664 44.0496 0.994918\n173.256 383.284 30.6572 37.4854 0.993937\n859.331 438.068 34.5137 40.1411 0.992819\n139.224 325.545 37.2591 46.9104 0.992599\n517.539 258.742 34.8766 46.7742 0.992304\n634.474 328.945 34.1324 41.3951 0.99223\n764.317 267.732 36.4325 44.6274 0.991453\n356.862 282.651 38.3657 45.7119 0.988963\n613.142 196.97 39.3177 46.9763 0.986956\n853.06 203.451 42.4352 53.3326 0.986847\n303.986 195.971 43.8529 55.1288 0.986732\n566.693 308.463 29.2115 35.3712 0.984446\n405.58 240.303 40.1848 46.2341 0.980785\n217.786 220.552 37.6437 45.7677 0.977405" +"/**\n * Provides classes representing Python classes.\n */\n\nimport python\n\n/**\n * An (artificial) expression corresponding to a class definition.\n * It is recommended to use `ClassDef` instead.\n */\nclass ClassExpr extends ClassExpr_ {\n /** Gets the metaclass expression */\n Expr getMetaClass() {\n if major_version() = 3\n then\n exists(Keyword metacls |\n this.getAKeyword() = metacls and\n metacls.getArg() = \"metaclass\" and\n result = metacls.getValue()\n )\n else\n exists(Assign a |\n a = this.getInnerScope().getAStmt() and\n a.getATarget().(Name).getId() = \"__metaclass__\" and\n result = a.getValue()\n )\n }\n\n /** Gets the nth keyword argument of this class definition. */\n override DictUnpackingOrKeyword getKeyword(int index) {\n result = this.getKeywords().getItem(index)\n }\n\n /** Gets a keyword argument of this class definition. 
*/\n override DictUnpackingOrKeyword getAKeyword() { result = this.getKeywords().getAnItem() }\n\n override Expr getASubExpression() {\n result = this.getABase() or\n result = this.getAKeyword().getValue() or\n result = this.getKwargs() or\n result = this.getStarargs()\n }\n\n /** Gets a call corresponding to a decorator of this class definition. */\n Call getADecoratorCall() {\n result.getArg(0) = this or\n result.getArg(0) = this.getADecoratorCall()\n }\n\n /** Gets a decorator of this function expression */\n Expr getADecorator() { result = this.getADecoratorCall().getFunc() }\n\n override AstNode getAChildNode() {\n result = this.getASubExpression()\n or\n result = this.getInnerScope()\n }\n\n /** Gets a tuple (*) argument of this class definition." +"#' @importFrom R6 R6Class\n#' @importFrom scales comma\n#' @importFrom ggplot2 scale_color_continuous\nChoropleth = R6Class(\"Choropleth\", \n \n public = list(\n # the key objects for this class\n user.df = NULL, # input from user\n map.df = NULL, # geometry of the map\n choropleth.df = NULL, # result of binding user data with our map data\n \n title = \"\", # title for map\n legend = \"\", # title for legend\n warn = TRUE, # warn user on clipped or missing values \n ggplot_scale = NULL, # override default scale.\n # warning, you need to set \"drop=FALSE\" for insets to render correctly\n \n # a choropleth map is defined by these two variables\n # a data.frame of a map\n # a data.frame that expresses values for regions of each map\n initialize = function(map.df, user.df)\n {\n stopifnot(is.data.frame(map.df))\n stopifnot(\"region\" %in% colnames(map.df))\n self$map.df = map.df\n \n # all input, regardless of map, is just a bunch of (region, value) pairs\n stopifnot(is.data.frame(user.df))\n stopifnot(c(\"region\", \"value\") %in% colnames(user.df))\n self$user.df = user.df\n self$user.df = self$user.df[, c(\"region\", \"value\")]\n \n stopifnot(anyDuplicated(self$user.df$region) == 0)\n \n # things like insets 
won't color properly if they are characters, and not factors\n if (is.character(self$user.df$value))\n {\n self$user.df$value = as.factor(self$user.df$value)\n }\n \n # initialize the map to the max zoom - i.e. all regions\n self$set_zoom(NULL)" +"# azure-iot-security-tpm.TpmSecurityClient Requirements\n\n## Overview\n`TpmSecurityClient` provides query operations for TPM entities and key manipulation operations for the device registration client.\n\n## Example usage\n\nvar securityClient = new tpmSecurity.TpmSecurityClient();\nvar provisioningClient = ProvisioningDeviceClient.create(provisioningHost, idScope, new provisioningTransport(), securityClient);\n\n### TpmSecurityClient(registrationId?: string, customTpm?: any) [constructor]\n\nThe `TpmSecurityClient` constructor initializes a new instance of a `TpmSecurityClient` object that is used to query TPM entities and perform key manipulation and setup.\n\n**SRS_NODE_TPM_SECURITY_CLIENT_06_001: [** The `registrationId` argument if present will be returned as the `registrationId` for subsequent calls to `getRegistrationId`. **]**\n\n**SRS_NODE_TPM_SECURITY_CLIENT_06_002: [** The `customTpm` argument, if present` will be used at the underlying TPM provider. Otherwise the TPM provider will the tss TPM client with a parameter of `false` for simulator use. **]**\n\n### getEndorsementKey(callback: (err: Error, endorsementKey: string) => void): void\n\nQueries and returns the public portion of the endorsement key in the TPM hardware.\n\n**SRS_NODE_TPM_SECURITY_CLIENT_06_006: [** The `getEndorsementKey` function shall query the TPM hardware and return the `endorsementKey` in the callback. **]**\n\n**SRS_NODE_TPM_SECURITY_CLIENT_06_017: [** If the endorsement key does NOT exist, a new key will be created. **]**\n\n**SRS_NODE_TPM_SECURITY_CLIENT_06_007: [** Any errors from interacting with the TPM hardware will cause a `SecurityDeviceError` to be returned in the `err` parameter of the callback." 
+"# Monitoring\n\nBy default, the KUDO Kafka Operator comes with the JMX Exporter agent enabled. It also includes a Prometheus Node Exporter sidecar container to\nexport container metrics like the disk usage of persistence volumes used by KUDO Kafka.\n\nWhen the Kafka operator is deployed with the parameter `METRICS_ENABLED=true` (which defaults to `true`) then:\n\n- Each broker bootstraps with the [JMX Exporter](https://github.com/prometheus/jmx_exporter) java agent exposing the metrics at `9094/metrics`,\n along with a [Prometheus Node Exporter](https://github.com/prometheus/node_exporter) sidecar exposing container metrics at `9096/metrics`.\n- Adds a port named `metrics` and `ne-metrics` to the Kafka Service.\n- Adds a label `kudo.dev/servicemonitor: \"true\"` for the service monitor discovery. \n- Mounts a config map with [metrics reporter](https://github.com/kudobuilder/operators/blob/master/repository/kafka/operator/templates/metrics-config.yaml) for the broker container.\n\n\n```\n$ kubectl describe svc kafka-svc\n...\nPort: metrics 9094/TCP\nTargetPort: 9094/TCP\n...\nPort: ne-metrics 9096/TCP\nTargetPort: 9096/TCP\n...\n```\n\n### Using the Prometheus Service Monitor\nTo use the prometheus service monitor, it's necessary to have installed the [prometheus operator](https://github.com/coreos/prometheus-operator) previously in the cluster.\n\nUsers can monitor the Kafka cluster using independent service monitor. Or use the one that comes with KUDO Kafka\n\n`$ kubectl kudo install kafka --instance=kafka-instance -p ADD_SERVICE_MONITOR=true`\n\nOr users can provide their own service-monitor. If Kafka is using the default" +".. _configuration.layers.parameterfilters:\n\nParameter Filters\n=================\n\nParameter filters provide templates for extracting arbitrary parameters from requests. 
This allows using GeoWebCache in scenarios such as time series data, multiple styles for the same layer or with CQL filters set by the client.\n\nThere are four types of parameter filters:\n\n#. **String filters** (``<stringParameterFilter>``)\n#. **Floating point number filters** (``<floatParameterFilter>``)\n#. **Integer filters** (``<integerParameterFilter>``)\n#. **Regular expression filters** (``<regexParameterFilter>``)\n\nA given layer can have multiple types of parameter filters.\n\nIf a request does not comply with the allowed values of the set parameter, the request will fail, usually with an error such as::\n\n 400: <value> violates filter for parameter <key>\n\nString filter\n-------------\n\nGeoWebCache can also use an allowable list of string values in a parameter filter for a given key. If the string in the request matches one of the string specified in the parameter filter, the request will proceed.\n\nWhen specifying a string filter, three pieces of information are required:\n\n* **Key** (``<key>``). The key is not case sensitive.\n* **Default value** (``<defaultValue>``).\n* **List of strings** (``<values>``, ``<string>``). The strings are case sensitive.\n\nThis information is presented in the following schema inside the ``<wmsLayer>`` tag:\n\n.. 
code-block:: xml\n\n <parameterFilters>\n <stringParameterFilter>" +"# Feedback\nIf you have input on any of these goals or questions (and anything along these lines), drop a line as a Github issue with the 'feedback' label.\nI would be glad to discuss these with you.\n\n## Goals\nIn making this library, I had a number of design goals.\nI think they offer valuable context for any input you might have.\n\n- Keep the API surface minimal; provide configuration for the loading behaviour, but defer as much of the rendering as possible to the consumer\n- Make the state management in the components explicit and declarative\n- Provide auxiliary documentation for presentational patterns\n\n## Questions\nHere are some things I am wondering about that you might be able to help with:\n- Is the goal of the library apparent?\n- Do you think that the interface supports the goal of the library?\n- Were you able to figure out placeholder strategies? Are there things we could provide as examples?\n- Did you find the examples easy to navigate?" +"# Packers\nPacked programs have often been obfuscated to hide their logic. Since capa cannot handle obfuscation well, results may be misleading or incomplete. If possible, users should unpack input files before analyzing them with capa.\n\nIf capa detects that a program may be packed using its rules it warns the user.\n\n\n# Installers, run-time programs, etc.\ncapa cannot handle installers, run-time programs like .NET applications, or other packaged applications like AutoIt well. This means that the results may be misleading or incomplete.\n\nIf capa detects an installer, run-time program, etc. it warns the user.\n\n\n# Wrapper functions and matches in child functions\nCurrently capa does not handle wrapper functions or other matches in child functions.\n\nConsider this example call tree where `f1` calls a wrapper function `f2` and the `CreateProcess` API. 
`f2` writes to a file.\n\n```\nf1\n f2 (WriteFile wrapper)\n CreateFile\n WriteFile\n CreateProcess\n```\n\nHere capa does not match a rule that hits on file creation and execution on function `f1`. \n\nSoftware often contains such nested calls because programmers wrap API calls in helper functions or because specific compilers or languages, such as Go, layer calls.\n\nWhile a feature to capture nested functionality is desirable it introduces various" +"#ifndef __XEN_SHARED_H__\n#define __XEN_SHARED_H__\n\n#ifdef CONFIG_COMPAT\n\n#include <compat/xen.h>\n\ntypedef union {\n struct shared_info native;\n struct compat_shared_info compat;\n} shared_info_t;\n\n/*\n * Compat field is never larger than native field, so cast to that as it\n * is the largest memory range it is safe for the caller to modify without\n * further discrimination between compat and native cases.\n */\n#define __shared_info(d, s, field) \\\n (*(!has_32bit_shinfo(d) ? \\\n (typeof(&(s)->compat.field))&(s)->native.field : \\\n &(s)->compat.field))\n\ntypedef union {\n struct vcpu_info native;\n struct compat_vcpu_info compat;\n} vcpu_info_t;\n\n/* As above, cast to compat field type. */\n#define __vcpu_info(v, i, field) \\\n (*(!has_32bit_shinfo((v)->domain) ? 
\\\n (typeof(&(i)->compat.field))&(i)->native.field : \\\n &(i)->compat.field))\n\n#else\n\ntypedef struct shared_info shared_info_t;\n#define __shared_info(d, s, field) ((s)->field)\n\ntypedef struct vcpu_info vcpu_info_t;\n#define __vcpu_info(v, i, field) ((i)->field)\n\n#endif\n\nextern vcpu_info_t dummy_vcpu_info;\n\n#define shared_info(d, field) __shared_info(d, (d)->shared_info, field)\n#define vcpu_info(v, field) __vcpu_info(v, (v)->vcpu_info, field)\n\n#endif /* __XEN_SHARED_H__ */" +"A1\tA2\tN\tSNP\tZ\nA\tG\t10000.000\trs0\t-5.269\nA\tG\t10000.000\trs1\t0.500\nA\tG\t10000.000\trs2\t-1.079\nA\tG\t10000.000\trs3\t-0.574\nA\tG\t10000.000\trs4\t-0.460\nA\tG\t10000.000\trs5\t-0.671\nA\tG\t10000.000\trs6\t-1.189\nA\tG\t10000.000\trs7\t5.956\nA\tG\t10000.000\trs8\t-0.483\nA\tG\t10000.000\trs9\t-2.527\nA\tG\t10000.000\trs10\t3.342\nA\tG\t10000.000\trs11\t2.692\nA\tG\t10000.000\trs12\t0.048\nA\tG\t10000.000\trs13\t-2.656\nA\tG\t10000.000\trs14\t-2.447\nA\tG\t10000.000\trs15\t-3.625\nA\tG\t10000.000\trs16\t-4.099\nA\tG\t10000.000\trs17\t-0.403\nA\tG\t10000.000\trs18\t-4.030\nA\tG\t10000.000\trs19\t3.820\nA\tG\t10000.000\trs20\t7.138\nA\tG\t10000.000\trs21\t-1.128\nA\tG\t10000.000\trs22\t-0.504\nA\tG\t10000.000\trs23\t-1.229\nA\tG\t10000.000\trs24\t1.220\nA\tG\t10000.000\trs25\t-3.719\nA\tG\t10000.000\trs26\t-0.410\nA\tG\t10000.000\trs27\t-3.132\nA\tG\t10000.000\trs28\t-0.045\nA\tG\t10000.000\trs29\t1.881\nA\tG\t10000.000\trs30\t-0.869\nA\tG\t10000.000\trs31\t3.947\nA\tG\t10000.000\trs32\t5.499\nA\tG\t10000.000\trs33\t-0.183\nA\tG\t10000.000\trs34\t-2.974\nA\tG\t10000.000\trs35\t4.575\nA\tG\t10000.000\trs36\t2.838\nA\tG\t10000.000\trs37\t0.169\nA\tG\t10000.000\trs38\t3.906" +".. _tutorial:\n\nTutorial\n========\n\nThis is a step-by-step tutorial to help you configure OnionBalance.\n\nOnionBalance implements `round-robin` like load balancing on top of Tor\nonion services. 
A typical OnionBalance deployment will incorporate one management\nservers and multiple backend application servers.\n\nAssumptions\n-----------\n\nYou want to run:\n\n- one or more OnionBalance processes, to perform load balancing, on hosts\n named ``obhost1``, ``obhost2``.\n- two or more Tor processes, to run the Onion Services, on hosts named\n ``torhost1``, ``torhost2``.\n- two or more servers (e.g. web servers) or traditional load balancers on\n hosts named ``webserver1``, ``webserver2``.\n\nScaling up:\n\n- the number of ``obhostX`` can be increased but this will not help handling\n more traffic.\n- the number of ``torhostX`` can be increased up to 60 instances to handle\n more traffic.\n- the number of ``webserverX`` can be increased to handle more traffic until\n the Tor daemons in front of them become the bottleneck.\n\nScaling down:\n\n- the three type of services can be run on the same hosts. The number of hosts\n can scale down to one.\n\nReliability:\n\nContrarily to traditional load balancers, the OnionBalance daemon does not\nreceive and forward traffic. 
As such, ``obhostX`` does not need to be in\nproximity" +"graph [\n node_count 115\n edge_count 228\n\n node_data size float\n node_data label string\n\n node [\n id 1\n degree 2\n size 1\n label \"waiting_P3\"\n ]\n node [\n id 2\n degree 2\n size 1\n label \"releasing_P3\"\n ]\n node [\n id 3\n degree 2\n size 1\n label \"init_P3\"\n ]\n node [\n id 4\n degree 2\n size 1\n label \"ready_to_consume\"\n ]\n node [\n id 5\n degree 2\n size 1\n label \"ready_to_receive\"\n ]\n node [\n id 6\n degree 2\n size 1\n label \"waiting_P2\"\n ]\n node [\n id 7\n degree 2\n size 1\n label \"releasing_P2\"\n ]\n node [\n id 8\n degree 2\n size 1\n label \"init_P2\"\n ]\n node [\n id 9\n degree 2\n size 1\n label \"waiting_P1\"\n ]\n node [\n id 10\n degree 2\n size 1\n label \"releasing_P1\"\n ]\n node [\n id 11\n degree 2\n size 1\n label \"init_P1\"\n ]\n node [\n id 12\n degree 2\n size 1\n label \"ready_to_send\"\n ]\n node [\n id 13\n degree 2\n size 1\n label \"ready_to_produce\"\n ]\n node [\n id 14\n degree 6\n size 1\n label \"ext_P1\"\n ]\n node [\n id 15\n degree 4\n size 1\n label \"norm_P1\"\n ]\n node [\n id 16\n degree 6\n size 1\n label \"basic_P1\"\n ]\n node [\n id 17\n degree 4\n size 1\n label \"R2_off_P1\"\n ]\n node" +"Description: Composite target for files related to remote administration tools\nAuthor: Drew Ervin, Mathias Frank\nVersion: 1.3\nId: 31cf5a4e-c44c-4457-b11f-74dca73e141b\nRecreateDirectories: true\nTargets:\n -\n Name: RDP Logs\n Category: EventLogs\n Path: RDPLogs.tkape\n Comment: \"Contains Windows Event Logs related to RDP\"\n -\n Name: RDP Cache\n Category: ApplicationData\n Path: RDPCache.tkape\n Comment: \"Contains data cached during recent RDP sessions\"\n -\n Name: LogMeIn\n Category: ApplicationLogs\n Path: LogMeIn.tkape\n -\n Name: VNC\n Category: ApplicationLogs \n Path: VNCLogs.tkape\n -\n Name: Chrome Remote Desktop\n Category: ApplicationLogs\n Path: ApplicationEvents.tkape\n -\n Name: TeamViewer\n Category: ApplicationLogs\n 
Path: TeamViewerLogs.tkape\n -\n Name: Ammyy\n Category: Ammyy.tkape\n Path: ApplicationLogs\n -\n Name: Kaseya\n Category: ApplicationLogs\n Path: Kaseya.tkape\n -\n Name: ScreenConnect (ConnectWise Control)\n Category: ApplicationLogs\n Path: ScreenConnect.tkape\n -\n Name: Radmin\n Category: ApplicationLogs\n Path: Radmin.tkape\n Comment: \"Radmin Server and Viewer Logs and Chats\"" +"import collections\nimport contextlib\nimport shutil\nimport sys\nimport tempfile\n\nimport numpy\nimport six\n\nimport chainer\n# import classes and functions\nfrom chainer.utils.array import size_of_shape # NOQA\nfrom chainer.utils.array import sum_to # NOQA\nfrom chainer.utils.conv import get_conv_outsize # NOQA\nfrom chainer.utils.conv import get_deconv_outsize # NOQA\nfrom chainer.utils.error import _format_array_props # NOQA\nfrom chainer.utils.experimental import experimental # NOQA\nfrom chainer.utils.meta import enable_final # NOQA\nfrom chainer.utils.meta import final # NOQA\nfrom chainer.utils.nondeterministic import nondeterministic # NOQA\nfrom chainer.utils.sparse import CooMatrix # NOQA\nfrom chainer.utils.sparse import get_order # NOQA\nfrom chainer.utils.sparse import to_coo # NOQA\n\n# The following alias has been moved to chainer/__init__.py in order to break\n# circular imports in Python 2.\n# from chainer.utils.walker_alias import WalkerAlias\n\n\n# TODO(kmaehashi) remove this when `six.moves.collections_abc` is implemented.\n# See: https://github.com/chainer/chainer/issues/5097\ntry:\n collections_abc = collections.abc # type: ignore\nexcept AttributeError: # python <3.3\n collections_abc = collections # type: ignore\n\n\ndef force_array(x, dtype=None):\n # numpy returns a float value (scalar) when a return value of an operator\n # is a 0-dimension array.\n # We need to convert such a value to a 0-dimension array because `Function`\n # object needs to return an `numpy.ndarray`.\n if numpy.isscalar(x):\n if dtype is None:\n return numpy.array(x)\n else:\n 
return numpy.array(x," +"import { BTree } from './btree'\n\nexport function tokIsKeyword(t :token) :bool {\n return token.keyword_beg < t && t < token.keyword_end\n}\n\nexport function hasIntValue(t :token) :bool {\n return t == token.CHAR\n}\n\nexport function hasByteValue(t :token) :bool {\n return (\n (token.literal_beg < t && t < token.literal_end) ||\n t == token.COMMENT\n )\n}\n\n// Operator precedences\nexport enum prec {\n LOWEST, // = := ! <- ->\n OROR, // ||\n ANDAND, // &&\n CMP, // == != < <= > >=\n ADD, // + - | ^\n MUL, // * / % & &^ << >>\n}\n\nexport enum token {\n // Special tokens\n ILLEGAL = 0,\n EOF,\n COMMENT,\n DIRECTIVE, // #directive\n\n literal_beg,\n // Identifiers and basic type literals\n // (these tokens stand for classes of literals)\n NAME, // main\n NAMEAT, // @foo, @\n literal_num_beg,\n literal_int_beg,\n CHAR, // 'a'\n INT, // 12345\n INT_BIN, // 0b1010\n INT_OCT, // 0o6737\n INT_HEX, // 0xBE3f\n literal_int_end,\n FLOAT, // 123.45\n // RATIO, // 22/7\n literal_num_end,\n STRING, // \"abc\"\n STRING_MULTI, // \"ab\\nc\" \u2014 multi-line\n STRING_PIECE, // \"a ${...} b\" -- the \"a \" part (\" b\" is STRING)\n literal_end,\n\n // Delimiters\n delim_beg,\n LPAREN, // (\n LBRACKET, // [\n LBRACE, // {\n COMMA, // ,\n DOT, //" +"---\ntitle: Show import items\ndescription: Use Show Import Items to expand IntelliSense in Visual Studio for Mac.\nauthor: cobey\nms.author: cobey\nms.date: 03/29/2019\nms.assetid: C7782BF3-016F-4B41-8A81-85FC540A1A8F\nms.custom: video\n---\n# Show import items\n\nVisual Studio for Mac can show all available types, even if they aren't imported to your project, in your IntelliSense completion list. 
By selecting an item which isn't imported, the correct `using` statement will be added to your source file.\n\n![show import items overview](media/importitems-overview.gif)\n\n## How to enable\n\nTo enable this feature, open **Preferences** via **Visual Studio** > **Preferences** and navigate to **Text Editor** > **IntelliSense**. Check the box **Show import items** to enable additional items in IntelliSense.\n\n![show import items option](media/show-import-items.png)\n\n## Usage\n\nOnce you enable **Show import items**, the process of using the feature to import an item is similar to the normal actions within IntelliSense. As you type code, items that are valid will populate the completion list. This includes items that haven't been imported yet. The items that aren't imported will show their full namespace to the right of the item, allowing you to see which imports you are pulling in to your project.\n\n![show import items list](media/show-import-items-list.png)\n\nIn the IntelliSense list, namespaces" +"[System.Collections.ArrayList]$queue = @()\n# isEmpty?\nif ($queue.Count -eq 0) {\n \"isEmpty? result : the queue is empty\"\n} else {\n \"isEmpty? result : the queue is not empty\"\n}\n\"the queue contains : $queue\"\n$queue += 1 # push\n\"push result : $queue\"\n$queue += 2 # push\n$queue += 3 # push\n\"push result : $queue\"\n\n$queue.RemoveAt(0) # pop\n\"pop result : $queue\"\n\n$queue.RemoveAt(0) # pop\n\"pop result : $queue\"\n\nif ($queue.Count -eq 0) {\n \"isEmpty? result : the queue is empty\"\n} else {\n \"isEmpty? 
result : the queue is not empty\"\n}\n\"the queue contains : $queue\"" +"# WMI Query that specifies what events we will watch for\n$Query = \"Select * from __InstanceModificationEvent within 3 where TargetInstance ISA 'MSVM_ComputerSystem' `\n and TargetInstance.EnabledState <> PreviousInstance.EnabledState\"\n\n# Script block that will run whenever the event fires\n[ScriptBlock]$Action = {\n $VMName = $Event.SourceEventArgs.NewEvent.TargetInstance.ElementName\n\n switch ($Event.SourceEventArgs.NewEvent.TargetInstance.EnabledState)\n {\n 2 {$vmState = \"running\"}\n 3 {$vmState = \"turned off\"}\n 9 {$vmState = \"paused\"}\n 6 {$vmState = \"in a saved state\"}\n 10 {$vmState = \"starting\"}\n 4 {$vmState = \"stopping\"}\n default {$vmState = \"in an unknown state...\"}\n }\n\n if ($Event.SourceEventArgs.NewEvent.TargetInstance.EnabledState -eq 1)\n {\n $vmState = $Event.SourceEventArgs.NewEvent.TargetInstance.OtherEnabledState\n }\n\n Write-Host \"The virtual machine '$($vmName)' is now $($vmState).\"\n}\n\n# Register for the events\nRegister-WMIEvent -Query $Query -Action $Action -Namespace root\\virtualization\\v2\n\n# To clean up - run \"Get-EventSubscriber | Unregister-Event\"" +"//\n// Takes a screenshot of the given URL, uses named arguments passed in like so: phantomjs raster.js arg=value arg2=value2\n//\n// Arguments:\n// - url - URL to screenshot\n// - output - page to output (e.g. 
/tmp/output.png)\n// - width [optional] - default 1024 - viewport width\n// - height [optional] - viewport height (see note below on using height)\n// - debug [optional] - default false - whether to do some extra debugging\n// - div [optional] - a selector to use to screenshot to a specific element\n// - resourceWait [optional] - default 300 - the time to wait after the last resource has loaded in MS before taking the screenshot\n// - maxRenderWait [optional] - default 10000 - the maximum time to wait before taking the screenshot, regardless of whether resources are waiting to be loaded\n// - cutoffWait [optional] - default null - the maximum time to wait before cutting everything off and failing...this helps if there is a page taking a long time to load\n// - top, left, width, height [optional] - dimensions to use to screenshot a specific area of the screen\n//\n// == Important notice when providing height ==" +"source: Developing/Merging-Pull-Requests.md\npath: blob/master/doc/\n\n# Merging Pull Requests\n\n### GitHub\n\nWe will now build the monthly change log from our GitHub commits. 
When\nmerging a commit, please ensure you:\n\n- Click the `Merge pull request` button\n- Give the merge a descriptive but short title\n- For the commit message prepend it with one of the following tags for\n the pull request to appear in the changelog:\n - devices: or newdevice: For new device support.\n - feature: or feat: To indicate this is a new or updated feature\n - webui: or web: To indicate this is an update to the WebUI\n - fix: or bugfix: To show this is a bug fix.\n - refactoring: or refactor: When the changes are refactoring a large\n portion of code\n- You can reference an issue number with `#xyz`, i.e `#1234`\n- Use the `Confirm squash and merge` button to merge.\n\n### Example commits\n\n#### Feature\n\nfeature: Added new availability map #4401\n\n#### New device\n\nnewdevice: Added support for Cisco ASA #4402" +"module Vine where\r\nimport Rumpus\r\n\r\nstart :: Start\r\nstart = do\r\n\r\n let nMax = 100\r\n let branch 0 = return ()\r\n branch n = do\r\n let hue = fromIntegral n / fromIntegral nMax\r\n [x,y,z,w] <- replicateM 4 (randomRange (0,1))\r\n child <- spawnChild $ do\r\n myShape ==> Cube\r\n myPose ==> positionRotation (V3 0 0.4 0)\r\n (axisAngle (V3 x y z) w)\r\n mySize ==> V3 0.1 0.4 0.1\r\n myColor ==> colorHSL hue 0.8 0.8\r\n myUpdate ==> do\r\n now <- (*0.1) <$> getNow\r\n --setSize (V3 0.1 (sin now) 0.1)\r\n setRotation (V3 x y z) (sin now * w)\r\n lift $ inEntity child $ branch (n - 1)\r\n branch nMax\r\n return ()" +"/**\n * @file\n */\n\n#pragma once\n\n#include \"tb_widgets_common.h\"\n\nnamespace tb {\n\n/** TBToggleContainer is a widget that toggles a property when its value\n\tchange between 0 and 1. TOGGLE specifies what property will toggle.\n\tThis is useful f.ex to toggle a whole group of child widgets depending\n\ton the value of some other widget. By connecting the TBToggleContainer\n\twith a widget connection, this can happen completly automatically. 
*/\nclass TBToggleContainer : public TBWidget {\npublic:\n\t// For safe typecasting\n\tTBOBJECT_SUBCLASS(TBToggleContainer, TBWidget);\n\n\tTBToggleContainer();\n\n\t/** Defines what should toggle when the value changes. */\n\tenum TOGGLE {\n\t\tTOGGLE_NOTHING, ///< Nothing happens (the default)\n\t\tTOGGLE_ENABLED, ///< Enabled/disabled state\n\t\tTOGGLE_OPACITY, ///< Opacity 1/0\n\t\tTOGGLE_EXPANDED ///< Expanded/collapsed (In parent axis direction)\n\t};\n\n\t/** Set what should toggle when the value changes. */\n\tvoid setToggle(TOGGLE toggle);\n\tTOGGLE getToggle() const {\n\t\treturn m_toggle;\n\t}\n\n\t/** Set if the toggle state should be inverted. */\n\tvoid setInvert(bool invert);\n\tbool getInvert() const {\n\t\treturn m_invert;\n\t}\n\n\t/** Get the current value, after checking the invert mode. */\n\tbool getIsOn() const {\n\t\treturn m_invert ? m_value == 0 : !(m_value == 0);\n\t}\n\n\t/** Set the value of this widget. 1 will turn on the toggle, 0 will turn it" +"\"\"\"\nDummy layout. Used when somebody creates an `Application` without specifying a\n`Layout`.\n\"\"\"\nfrom prompt_toolkit.formatted_text import HTML\nfrom prompt_toolkit.key_binding import KeyBindings\nfrom prompt_toolkit.key_binding.key_processor import KeyPressEvent\n\nfrom .containers import Window\nfrom .controls import FormattedTextControl\nfrom .dimension import D\nfrom .layout import Layout\n\n__all__ = [\n \"create_dummy_layout\",\n]\n\nE = KeyPressEvent\n\n\ndef create_dummy_layout() -> Layout:\n \"\"\"\n Create a dummy layout for use in an 'Application' that doesn't have a\n layout specified. When ENTER is pressed, the application quits.\n \"\"\"\n kb = KeyBindings()\n\n @kb.add(\"enter\")\n def enter(event: E) -> None:\n event.app.exit()\n\n control = FormattedTextControl(\n HTML(\"No layout specified. 
Press <reverse>ENTER</reverse> to quit.\"),\n key_bindings=kb,\n )\n window = Window(content=control, height=D(min=1))\n return Layout(container=window, focused_element=window)" +"rand1024-2-6 \trand1024-2\r\n-1 \t-1.0\r\n150 \t150\r\n1024 \t128\r\n 15 15 8\r\n 6 37 8\r\n 43 35 8\r\n 7 68 8\r\n 7 22 8\r\n 49 22 8\r\n 3 37 8\r\n 67 10 8\r\n 27 68 8\r\n 68 22 8\r\n 45 28 8\r\n 13 62 8\r\n 11 52 8\r\n 15 67 8\r\n 26 13 8\r\n 52 13 8\r\n 52 30 8\r\n 32 52 8\r\n 41 2 8\r\n 53 64 8\r\n 60 18 8\r\n 20 68 8\r\n 59 48 8\r\n 49 39 8\r\n 10 34 8\r\n 49 5 8\r\n 42 26 8\r\n 63 33 8\r\n 47 28 8\r\n 67 42 8\r\n 25 5 8\r\n 36 17 8\r\n 6 1 8\r\n 69 32 8\r\n 4 18 8\r\n 59 29 8\r\n 18 13 8\r\n 39 32 8\r\n 45 14 8\r\n 12 53 8\r\n 65 12 8\r\n 5 68 8\r\n 38 32 8\r\n 57 69 8\r\n 40 17 8\r\n 42 14 8\r\n 37 18 8\r\n 50 34 8\r\n 30 30 8\r\n 36 52 8\r\n 53 34 8\r\n 27 54 8\r\n 61 57 8\r\n 2 23 8\r\n 40 24 8\r\n 5 37 8\r\n 31 30 8\r\n 58 4 8\r\n 68 48 8\r\n 15 23 8\r\n 43 19 8\r\n 33 61 8\r\n 25 65 8\r\n 17 53 8" +"AHEAD\nALL\nANYWHERE\nAXS\nBEEGA\nBEYOND\nBIG\nBOARDS\nBRITISH\nCALVIN\nCHABUTON\nCHAXLIE\nCHOC\nCHOCOOLATE\nCHRISTMAS\nCOMFORTABLE\nCRACK\nCRUNCHY\nCTE\nDIANE\nDIFFERENT\nDINING\nEARNING\nEAUTY\nELIZABETH\nENTRANCE\nEPICENTRE\nEVELYN\nEXC\nEXCEPT\nEXODUS\nFAIRPRICE\nFEDEX\nFINE\nFLESHIMP\nFOODS\nFOR\nFOREVER\nFRESHLY\nGELS\nGGULDEN\nGIFTS\nGIRLS\nGRACIOUS\nGRAPHIC\nHAAGEN\nHABA\nHOUR\nINC\nJOINT\nKEEP\nKFC\nKIMMIC\nLUMI\nMAXMARA\nMEAL\nMEETS\nMPH\nNEXT\nOFFICE\nOPENING\nOPTIONS\nORGANI\nOWNDAYS\nPACIFIC\nPARAGON\nPARIS\nPART\nPEPERONI\nPRICE\nPURCHASES\nRAFFLES\nRDJ\nSACOOR\nSALON\nSAYOUR\nSEAS\nSIGN-UP\nSINCERE\nSIS\nSOON\nSPEND\nSPOON\nSPOT\nSSI\nSTAKE\nSTARS\nSUBWAY\nSUPPORT\nTALK\nTANUKI\nTAPE\nTIMBERLAND\nTODS\nTRANSIT\nTURNING\nUOB\nUPP\nWHEELOCK\nYOUR" +"---\nid: version-v1.16.0-throttling-queued-executions\ntitle: Throttling queued executions\nhide_title: true\noriginal_id: throttling-queued-executions\n---\n\n# Throttling queued executions\n\nIn this entry, we will 
walk through how to create an SQS queue for scheduling executions which will be used to limit those executions to a maximum concurrency. And we will see how to configure our Cumulus workflows/rules to use this queue.\n\nWe will also review the architecture of this feature and highlight some implementation notes.\n\nLimiting the number of executions that can be running from a given queue is useful for controlling the cloud resource usage of workflows that may be lower priority, such as granule reingestion or reprocessing campaigns. It could also be useful for preventing workflows from exceeding known resource limits, such as a maximum number of open connections to a data provider.\n\n## Implementing the queue\n\n### Create and deploy the queue\n\n#### Add a new queue\n\nIn a `.tf` file for your [Cumulus deployment](./../deployment/deployment-readme#deploy-the-cumulus-instance), add a new SQS queue:\n\n```hcl\nresource \"aws_sqs_queue\" \"background_job_queue\" {\n name = \"${var.prefix}-backgroundJobQueue\"\n receive_wait_time_seconds = 20\n visibility_timeout_seconds = 60\n}\n```\n\n#### Set maximum executions for the queue\n\nDefine the `throttled_queues` variable for the `cumulus` module in your [Cumulus deployment](./../deployment/deployment-readme#deploy-the-cumulus-instance) to specify the maximum concurrent executions" +"% CVX: Matrix structure definitions and utilities.\n% CVX provides a keyword-based method for defining matrices\n% with one or more types of structure; e.g.\n% variable X(n,n) symmetric toeplitz tridiagonal;\n% CVX automatically computes an efficient basis for the requested\n% structure. The files in this directory implement those computations.\n%\n% None of these files should be called directly---matrix structure is\n% selected in the VARIABLE declaration; see VARIABLE for more details.\n% Below are the keywords that are available, and the structures they\n% represent. 
Keywords can be freely combined (see the above example),\n% but of course some combinations are degenerate, yielding only the \n% all-zero matrix; e.g.,\n% variable X(n,n) \n%\n% Structures:\n% banded - (U,L)-banded matrices.\n% complex - Complex variables of all sizes.\n% diagonal - Diagonal matrices.\n% hankel - Hankel matrices.\n% hermitian - Complex Hermitian matrices.\n% lower_bidiagonal - Lower bidiagonal matrices.\n% lower_hessenberg - Lower Hessenberg matrices.\n% lower_triangular - Lower triangular matrices.\n% scaled_identity - Scaled identity: t*eye(n).\n% skew_symmetric - Skew-symmetric matrices.\n% sparse - Matrices with a fixed sparsity pattern.\n% symmetric - Symmetric matrices.\n% toeplitz - Toeplitz matrices.\n% tridiagonal - Tridiagional matrices." +"! Copyright (C) 2007, 2008 Phil Dawes, 2013 John Benediktsson\n! See http://factorcode.org/license.txt for BSD license.\nUSING: combinators fry io io.files io.streams.string kernel\nmake math memoize namespaces sbufs sequences sequences.private\nunicode ;\nIN: csv\n\nSYMBOL: delimiter\n\nCHAR: , delimiter set-global\n\n<PRIVATE\n\nMEMO: field-delimiters ( delimiter -- field-seps quote-seps )\n [ \"\\r\\n\" swap prefix ] [ \"\\r\\\"\\n\" swap prefix ] bi ; inline\n\nDEFER: quoted-field,\n\n: maybe-escaped-quote ( delimeter stream quoted? -- delimiter stream sep/f )\n 2over stream-read1 tuck =\n [ nip ] [\n {\n { CHAR: \\\" [ [ CHAR: \\\" , ] when quoted-field, ] }\n { CHAR: \\n [ ] } ! Error: cr inside string?\n { CHAR: \\r [ ] } ! Error: lf inside string?\n [ [ , drop f maybe-escaped-quote ] when* ]\n } case\n ] if ; inline recursive\n\n: quoted-field, ( delimiter stream -- delimiter stream sep/f )\n \"\\\"\" over stream-read-until drop % t maybe-escaped-quote ;\n\n: quoted-field ( delimiter stream -- sep/f field )\n [ quoted-field, 2nip ] \"\" make ;\n\n: ?trim ( string -- string' )\n dup length [ drop \"\" ] [\n over first-unsafe blank?\n [ drop t ] [ 1 - over nth-unsafe blank? 
] if\n [ [" +"from cs50 import get_int\nfrom math import floor\n\n\ndef main():\n digit1 = 0\n digit2 = 0\n num_digits = 0\n sum_of_double_odds = 0\n sum_of_evens = 0\n cc_number = get_int(\"Number: \")\n\n while cc_number > 0:\n digit2 = digit1\n digit1 = cc_number % 10\n\n if num_digits % 2 == 0:\n sum_of_evens += digit1\n else:\n multiple = 2 * digit1\n sum_of_double_odds += (multiple // 10) + (multiple % 10)\n\n cc_number //= 10\n num_digits += 1\n\n is_valid = (sum_of_evens + sum_of_double_odds) % 10 == 0\n first_two_digits = (digit1 * 10) + digit2\n\n if digit1 == 4 and num_digits >= 13 and num_digits <= 16 and is_valid:\n print(\"VISA\\n\")\n elif first_two_digits >= 51 and first_two_digits <= 55 and num_digits == 16 and is_valid:\n print(\"MASTERCARD\\n\")\n elif (first_two_digits == 34 or first_two_digits == 37) and num_digits == 15 and is_valid:\n print(\"AMEX\\n\")\n else:\n print(\"INVALID\\n\")\n\n\nif __name__ == \"__main__\":\n main()" +"#pragma once\r\n\r\nnamespace SipServer {\r\n\r\n/**\r\n * \u5904\u7406\u63a5\u6536\u7684\u6d88\u606f\uff0c\u89e3\u6790\u53ca\u5e94\u7b54\u3002\u53d1\u9001message\u8bf7\u6c42(\u67e5\u8be2\u76ee\u5f55\u3001\u4e91\u53f0\u63a7\u5236)\r\n */\r\nclass CSipMessage\r\n{\r\npublic:\r\n\r\n /**\r\n * \u5904\u7406Message\u4e8b\u4ef6\r\n */\r\n void OnMessage(eXosip_event_t *osipEvent);\r\n\r\n /**\r\n * \u76ee\u5f55\u67e5\u8be2\r\n */\r\n void QueryDirtionary();\r\n\r\n /**\r\n * \u8bbe\u5907\u72b6\u6001\u67e5\u8be2\r\n * @param devID[in] \u8bbe\u5907id\r\n */\r\n void QueryDeviceStatus(string devID);\r\n\r\n /**\r\n * \u8bbe\u5907\u4fe1\u606f\u67e5\u8be2\u8bf7\u6c42\r\n * @param devID[in] \u8bbe\u5907id\r\n */\r\n void QueryDeviceInfo(string devID);\r\n\r\n /**\r\n * \u6587\u4ef6\u76ee\u5f55\u68c0\u7d22\u8bf7\u6c42\r\n * @param devID[in] \u8bbe\u5907id\r\n */\r\n void QueryRecordInfo(string devID, string strStartTime, string strEndTime);\r\n\r\n /**\r\n * \u79fb\u52a8\u8bbe\u5907\u4f4d\u7f6e\u67e5\u8be2\r\n * @param devID[in] 
\u8bbe\u5907id\r\n */\r\n void QueryMobilePosition(string devID);\r\n\r\n /**\r\n * \u4e91\u53f0\u63a7\u5236\r\n * @param strDevCode[in] \u8bbe\u5907\u7f16\u7801\r\n * @param nInOut[in] \u955c\u5934\u653e\u5927\u7f29\u5c0f 0:\u505c\u6b62 1:\u7f29\u5c0f 2:\u653e\u5927\r\n * @param nUpDown[in] \u955c\u5934\u4e0a\u79fb\u4e0b\u79fb 0:\u505c\u6b62 1:\u4e0a\u79fb 2:\u4e0b\u79fb\r\n * @param nLeftRight[in] \u955c\u5934\u5de6\u79fb\u53f3\u79fb 0:\u505c\u6b62 1:\u5de6\u79fb 2:\u53f3\u79fb\r\n * @param cMoveSpeed[in] \u955c\u5934\u7f29\u653e\u901f\u5ea6\r\n * @param cMoveSpeed[in] \u955c\u5934\u79fb\u52a8\u901f\u5ea6\r\n */\r\n void DeviceControl(string strDevCode,\r\n int nInOut = 0, int nUpDown = 0, int nLeftRight = 0, \r\n uint8_t cInOutSpeed = 0X1, uint8_t cMoveSpeed = 0XFF);\r\n\r\n};\r\n\r\n};" +"(* See docs/unittests.md for documentation on how to use this. *)\n\nlet domTests = ref false\n\nlet process_cmdline_args () =\n let command = ref None in\n Tc.Array.iter Sys.argv ~f:(fun str ->\n match (!command, str) with\n | None, \"--pattern\" ->\n command := Some str\n | None, \"--dom\" ->\n domTests := true\n | None, \"--verbose\" ->\n Tester.verbose := true\n | None, \"--help\" ->\n Js.log\n \"Run Dark's client-side unit tests. Supported arguments:\\n --dom: run the DOM tests (slow)\\n --verbose: print test names\\n --help: Print this message\\n --pattern 'some-regex': Run any test that contains this regex\" ;\n exit 0\n | Some \"--pattern\", str ->\n Tester.pattern := Some (Js.Re.fromString str) ;\n command := None\n | None, _ when Tc.String.endsWith str ~suffix:\"unittests.bs.js\" ->\n (* ignore the filename *)\n ()\n | None, \"/usr/bin/node\" ->\n (* ignore *)\n ()\n | _ ->\n Js.log (\"Unsupported command line argument: \" ^ str))\n\n\n(* See docs/unittests.md for documentation on how to use this. 
*)\nlet () =\n let open Tester in\n process_cmdline_args () ;\n describe \"Analysis_test\" Analysis_test.run ;\n describe \"APIError test\" Api_error_test.run ;\n describe \"Ast_test\" Ast_test.run ;\n describe \"Autocomplete_test\" Autocomplete_test.run ;\n describe \"Curl_test\" Curl_test.run ;\n describe \"Darkstorage_test\" Darkstorage_test.run ;\n describe \"Encoder test\" Encoder_test.run ;\n describe \"Feature Flag test\" Feature_flag_test.run ;\n describe" +"#pragma once\n#include \"SQLiteNode.h\"\n\nclass SQLiteCommand {\n public:\n\n // This allows for modifying a request passed into the constructor such that we can store it as `const`.\n static SData preprocessRequest(SData&& request);\n\n // If this command was created via an escalation from a peer, this value will point to that peer object. As such,\n // this should only ever be set on leader nodes, though it does not *need* to be set on leader nodes, as they can\n // also accept connections directly from clients.\n // A value of zero is an invalid ID, and is interpreted to mean \"not set\".\n // A negative value indicates a valid ID of an invalid peer (a psuedo-peer, or a disconnected peer), that we can't\n // respond to.\n int64_t initiatingPeerID;\n\n // If this command was created via a direct client connection, this value should be set. This can be set on both\n // leader and followers, but should not be set simultaneously with `initiatingPeerID`. A command was initiated either\n // by a client, or by a peer.\n // A value of zero is an invalid ID, and is interpreted to mean \"not set\".\n // A negative value indicates a valid ID of an invalid" +"# Sharing your project\n\nOnce you've made your project, you can save it the cloud, share it, or embed it on another website.\n\n* Click **More...**, then **Embed Project**:\n* Click **Publish project**. 
This will make the project publicly available\n* You will then see this information:\n\n## Sharing the URL\n\nYou can share the URL for the project with other people, and they will be able to visit that page to see your project, download it, or edit it. \n\n### ~hint\n\n**Developers:** This page supports [OEmbed](https://oembed.com/).\n\n### ~\n\n## Embedding into a blog or web site\n\nRather than just sharing the link, you can also embed the project so that your visitors can use the simulator, edit blocks or code, or download the project without having to leave your site.\n\n### General instructions\n\nSelect the kind of embedding you would like.\n\n* **Screenshot** - a lightweight screenshot of the blocks that links to the snippet\n* **Editor** - embedded editor with minimal UI\n* **Simulator** - embedded simulator only\n* **Command line** - specific instructions to unpack the project using the [command line](/cli) tools\n\nCopy the HTML for embedding the page from the publish dialog. It will look like" +"import torch\nimport zmq\nimport flatbuffers\nfrom termcolor import colored\n\nfrom . 
import util, state, __version__\nfrom .distributions import Uniform, Normal, Categorical, Poisson, Bernoulli, Beta, Exponential, Gamma, LogNormal, Binomial, Weibull\nfrom .ppx import Message as ppx_Message\nfrom .ppx import MessageBody as ppx_MessageBody\nfrom .ppx import Tensor as ppx_Tensor\nfrom .ppx import Distribution as ppx_Distribution\nfrom .ppx import Uniform as ppx_Uniform\nfrom .ppx import Normal as ppx_Normal\nfrom .ppx import Categorical as ppx_Categorical\nfrom .ppx import Poisson as ppx_Poisson\nfrom .ppx import Bernoulli as ppx_Bernoulli\nfrom .ppx import Beta as ppx_Beta\nfrom .ppx import Exponential as ppx_Exponential\nfrom .ppx import Gamma as ppx_Gamma\nfrom .ppx import LogNormal as ppx_LogNormal\nfrom .ppx import Binomial as ppx_Binomial\nfrom .ppx import Weibull as ppx_Weibull\nfrom .ppx import Handshake as ppx_Handshake\nfrom .ppx import HandshakeResult as ppx_HandshakeResult\nfrom .ppx import Run as ppx_Run\nfrom .ppx import RunResult as ppx_RunResult\nfrom .ppx import Sample as ppx_Sample\nfrom .ppx import SampleResult as ppx_SampleResult\nfrom .ppx import Observe as ppx_Observe\nfrom .ppx import ObserveResult as ppx_ObserveResult\nfrom .ppx import Tag as ppx_Tag\nfrom .ppx import TagResult as ppx_TagResult\nfrom .ppx import Reset as ppx_Reset\n\n\nclass ZMQRequester():\n def __init__(self, server_address):\n self._server_address = server_address\n self._context = zmq.Context.instance()\n self._socket = self._context.socket(zmq.REQ)" +"(ns puppetlabs.puppetdb.cli.benchmark\n \"Benchmark suite\n\n This command-line utility will simulate catalog submission for a\n population. 
It requires a separate, running instance of\n PuppetDB to submit catalogs to.\n\n Aspects of a population this tool currently models:\n\n * Number of nodes\n * Run interval\n * How often a host's catalog changes\n * A starting catalog\n\n We attempt to approximate a number of hosts submitting catalogs at\n the specified runinterval with the specified rate-of-churn in\n catalog content.\n\n The list of nodes is modeled in the tool as a set of Clojure\n agents, with one agent per host. Each agent has the following\n state:\n\n {:host ;; the host's name\n :lastrun ;; when the host last sent a catalog\n :catalog ;; the last catalog sent}\n\n When a host needs to submit a new catalog, we determine if the new\n catalog should be different than the previous one (based on a\n user-specified threshold) and send the resulting catalog to\n PuppetDB.\n\n ### Main loop\n\n The main loop is written in the form of a wall-clock\n simulator. Each run through the main loop, we send each agent an\n `update` message with the current wall-clock. Each agent decides\n independently whether or not to submit a catalog during
O resto do nosso banco de dados foi gerado por nossos usu\u00e1rios. Se voc\u00ea conhece um banheiro neutro ou seguro, por favor adicione ao nosso banco de dados!'\n p3header: 'Por que voc\u00eas escolheram o nome REFUGE?'\n p3: 'N\u00f3s acreditamos firmemente que todas as pessoas t\u00eam o direito de usar o banheiro com seguran\u00e7a e n\u00f3s quer\u00edamos que o nome da nossa aplica\u00e7\u00e3o tivesse um pouco da mesma dignidade que queremos dar aos nossos usu\u00e1rios. Falando de um modo simples, gostar\u00edamos de fornecer um lugar de ref\u00fagio na sua necessidade.'\n p4header: \"Qual a import\u00e2ncia dos banheiros afinal, e" +"\"\nI am ZnMessageBenchmark helps to test the benchmarking and profiling of ZnMessage writing and reading.\n\nInstance Variables\n\tbuffer:\t\t\t\t\t<ByteArray>\n\tmessage:\t\t\t\t<ZnObject>\n\trepresentation:\t\t<ByteArray>\n\nZnMessageBenchmark new\n\tsimpleRequest;\n\twrite: 10000.\n\nZnMessageBenchmark new\n\tsimpleRequest;\n\twriteRepresentation;\n\tread: 10000.\n\nZnMessageBenchmark new\n\tsimpleResponse;\n\twrite: 10000.\n\nZnMessageBenchmark new\n\tsimpleResponse;\n\twriteRepresentation;\n\tread: 10000.\n\n\"\nClass {\n\t#name : #ZnMessageBenchmark,\n\t#superclass : #Object,\n\t#instVars : [\n\t\t'message',\n\t\t'representation',\n\t\t'buffer'\n\t],\n\t#category : #'Zinc-Tests'\n}\n\n{ #category : #accessing }\nZnMessageBenchmark class >> bench: messages [\n\t| results |\n\tresults := Dictionary new.\n\tmessages \n\t\tdo: [ :each | | bench report |\n\t\t\tbench := self new.\n\t\t\tbench perform: each.\n\t\t\tbench writeRepresentation.\n\t\t\treport := 'Writing {1} - Reading {2}' format: { bench benchWrite. 
bench benchRead }.\n\t\t\tresults at: each put: report ] \n\t\tdisplayingProgress: 'Benchmarking...'.\n\t^ results\n]\n\n{ #category : #accessing }\nZnMessageBenchmark class >> benchAll [\n\t^ self bench: self messages\n]\n\n{ #category : #accessing }\nZnMessageBenchmark class >> messages [\n\t^ self requests , self responses\n]\n\n{ #category : #accessing }\nZnMessageBenchmark class >> requests [\n\t^ #( simpleRequest standardRequest postRequest )\n]\n\n{ #category : #accessing }\nZnMessageBenchmark class >> responses [\n\t^ #( \n\t\tsimpleResponse \n\t\ttextResponse8k textResponse64k textResponse256k\n\t\ttextResponseWide8k textResponseWide64k textResponseWide256k\n\t\tasciiResponse8k asciiResponse64k asciiResponse256k\n\t\tbinaryResponse8k" +"\n#ifndef SAGE_MIME_H\n#define SAGE_MIME_H\n\n#include <QString>\n\nconst QString SG_NODE_MIMETYPE = \"application/SgNode\";\nconst QString SG_NODE_BINARY_MIMETYPE = \"application/SgNode-binary\";\nconst QString SG_NODE_SOURCE_MIMETYPE = \"application/SgNode-source\";\n\nclass SgNode;\n\nclass QMimeData;\n\ntypedef std::vector<SgNode *> SgNodeVector;\n\n/// creates MIME Data out of a SgNode\n/// depending on the node, the base type (source or binary) gets set by finding\n/// the appropriate SgFile\n/// the other file gets set if the appropriate Link Attribute is set\nQMimeData *createSageMimeData( SgNode *node );\n\n/// extract the MIME data, returns NULL if not set\nSgNode *getGeneralNode( const QMimeData *data );\nSgNodeVector getSourceNodes( const QMimeData *data );\nSgNodeVector getBinaryNodes( const QMimeData *data );\n\nSgNodeVector getNodes( const QMimeData *data, const QString& type );\n\n#endif" +"const Edge = require('../lib/edge.js')\nconst t = require('tap')\n\n// slight hack to snapshot the getters\n// would be nice if tcompare.format showed these by default,\n// but it's tricky to know when to show non-iterables and\n// when not to. 
Really, it'd be best if class getters were\n// iterable by default, or had some syntax to allow it, but\n// that's outside my sphere of direct influence, and using\n// Object.defineProperty(this, 'foo', { get ... }) is a pita.\nt.formatSnapshot = obj =>\n obj instanceof Edge ? {\n ...obj,\n spec: obj.spec,\n name: obj.name,\n type: obj.type,\n valid: obj.valid,\n error: obj.error,\n from: obj.from,\n to: obj.to,\n peer: obj.peer,\n dev: obj.dev,\n optional: obj.optional,\n workspace: obj.workspace,\n missing: obj.missing,\n peerLocal: obj.peerLocal,\n invalid: obj.invalid,\n __proto__: { constructor: Edge },\n } : obj\n\nconst reset = node => {\n node.edgesOut = new Map()\n node.edgesIn = new Set()\n}\n\n// mock nodes\nconst top = {\n edgesOut: new Map(),\n edgesIn: new Set(),\n package: { name: 'top', version: '1.2.3' },\n isTop: true,\n parent: null,\n resolve (n) {\n return n === 'a' ? a : n === 'b' ? b : null\n },\n addEdgeOut (edge) {\n this.edgesOut.set(edge.name, edge)\n },\n addEdgeIn (edge) {\n this.edgesIn.add(edge)\n },\n}\n\nconst a = {\n edgesOut:" +"from dino.hooks.ban import *\nfrom dino.hooks.connect import *\nfrom dino.hooks.create import *\nfrom dino.hooks.delete import *\nfrom dino.hooks.disconnect import *\nfrom dino.hooks.get_acl import *\nfrom dino.hooks.history import *\nfrom dino.hooks.invite import *\nfrom dino.hooks.join import *\nfrom dino.hooks.kick import *\nfrom dino.hooks.leave import *\nfrom dino.hooks.list_channels import *\nfrom dino.hooks.list_rooms import *\nfrom dino.hooks.login import *\nfrom dino.hooks.message import *\nfrom dino.hooks.remove_room import *\nfrom dino.hooks.request_admin import *\nfrom dino.hooks.set_acl import *\nfrom dino.hooks.status import *\nfrom dino.hooks.startup import *\nfrom dino.hooks.users_in_room import *\nfrom dino.hooks.whisper import *\nfrom dino.hooks.update_user_info import *\nfrom dino.hooks.read import *\nfrom dino.hooks.received import *\nfrom dino.hooks.heartbeat import *" +"require 
'diffy'\n\nmodule Refinery\n module Pages\n # Knows how to build the html for a section. A section is part of the visible html, that has\n # content wrapped in some particular markup. Construct with the relevant options, and then\n # call wrapped_html to get the resultant html.\n #\n # The content rendered will usually be the value of fallback_html, unless an override_html\n # is specified. However, on rendering, you can elect not display sections that have no\n # override_html by passing in false for can_use_fallback.\n #\n # Sections may be hidden, in which case they wont display at all.\n class SectionPresenter\n include ActionView::Helpers::TagHelper\n include ActionView::Helpers::SanitizeHelper\n\n def initialize(initial_hash = {})\n { logger: Rails.logger }.merge(initial_hash).map do |key, value|\n send(\"#{key}=\", value)\n end\n end\n\n attr_reader :id, :fallback_html, :hidden\n alias_method :hidden?, :hidden\n attr_accessor :override_html\n\n def visible?\n !hidden?\n end\n\n def has_content?(can_use_fallback = true)\n visible? 
&& content_html(can_use_fallback).present?\n end\n\n def wrapped_html(can_use_fallback = true)\n return if hidden?\n\n content = content_html(can_use_fallback)\n if content.present?\n wrap_content_in_tag(content)\n end\n end\n\n def hide\n self.hidden = true\n end\n\n def not_present_css_class\n \"no_#{id}\"\n end\n\n protected\n\n def content_html(can_use_fallback)\n override_html.presence || html_from_fallback(can_use_fallback)\n end\n\n def html_from_fallback(can_use_fallback)\n fallback_html.presence if can_use_fallback\n end\n\n private\n\n attr_accessor :logger\n attr_writer :id, :fallback_html, :hidden\n\n def wrap_content_in_tag(content)\n content_tag(:section, content_tag(:div, sanitize_content(content), :class => 'inner'), :id => id)\n end\n\n def" +"# edit the \"button =\" part for each entry according to your remote, and stick\n# this stuff in ~/.lircrc\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_PLAY\n\trepeat = 1\n\tconfig = play\nend\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_PAUSE\n\trepeat = 0\n\tconfig = pause\nend\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_PLAYPAUSE\n\trepeat = 1\n\tconfig = playpause\nend\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_STOP\n\trepeat = 1\n\tconfig = stop\nend\n\n#FIXME\n#begin\n#\tprog = Rhythmbox\n#\tremote = *\n#\tbutton = \n#\trepeat = 1\n#\tconfig = shuffle\n#end\n\n#FIXME\n#begin\n#\tprog = Rhythmbox\n#\tremote = *\n#\tbutton = \n#\trepeat = 1\n#\tconfig = repeat\n#end\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_NEXT\n\trepeat = 1\n\tconfig = next\nend\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_PREVIOUS\n\trepeat = 1\n\tconfig = previous\nend\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_FASTFORWARD\n\trepeat = 1\n\tconfig = seek_forward\nend\n\nbegin\n\tprog = Rhythmbox\n\tremote = *\n\tbutton = KEY_REWIND\n\trepeat = 1\n\tconfig = seek_backward\nend" +"#pragma once\n\n#include <string>\nnamespace 
ParaEngine\n{\n\tusing namespace std;\n\t/**\n\t* helper functions for editing. It is a group of static helper functions for scene editing. \n\t*/\n\tclass PE_CORE_DECL CEditorHelper\n\t{\n\tpublic:\n\t\t/**\n\t\t* return the file name from a given script text. \n\t\t* It searches \"NPL.load(\\\"\" in the script text, and returns its first string parameter. \n\t\t* @param output: output file name\n\t\t* @param sScript: script text. if the file name is not found within the first MAX_PATH=260 characters, \"\" is returned. \n\t\t* @param bRelativePath: if true, relative path is returned. if false, the complete NPL path in the script is returned, which may contain (gl) prefix.\n\t\t* @return: true if found. \n\t\t*/\n\t\tstatic bool SearchFileNameInScript(string & output, const char* sScript, bool bRelativePath = true);\n\t\t/** same as SearchFileNameInScript(). only used for API exportation. Not thread-safe*/\n\t\tstatic const char* SearchFileNameInScript_(const char* sScript, bool bRelativePath);\n\n\t\t/**\n\t\t* Open a given file with the default registered editor in the game engine. \n\t\t* @param sFileName: file name to be opened by the default editor.\n\t\t* @param bWaitOnReturn: if false, the function returns immediately; otherwise it will wait for the editor to return. \n\t\t* @return true if opened. 
\n\t\t*/\n\t\tstatic bool OpenWithDefaultEditor(" +"\nclass Foo(x: Int) {}\ncase class Bar(y: Int) extends Foo(y);\n\n\ntrait T {}\ntrait U {}\nclass C() {}\n\n\ntrait T1;\ntrait T2 {}\ntrait T5 extends T;\ntrait T6 extends T {}\ntrait T7 extends T with U;\ntrait T8 extends T with U {}\n\nclass C1();\nclass C2() {}\nclass C5() extends C();\nclass C6() extends C() {}\nclass C7() extends C() with U;\nclass C8() extends C() with U {}\n\ncase class D1();\ncase class D2() {}\ncase class D5() extends C();\ncase class D6() extends C() {}\ncase class D7() extends C() with U;\ncase class D8() extends C() with U {}\n\nobject M1;\nobject M2 {}\nobject M5 extends C();\nobject M6 extends C() {}\nobject M7 extends C() with U;\nobject M8 extends C() with U {}\n\n\n\n-----" +"package org.kframework.kdoc\n\nimport org.kframework.attributes.Att\nimport org.kframework.definition.{Module, NonTerminal, RegexTerminal, Terminal}\nimport org.kframework.frontend.K\nimport org.kframework.frontend.Unapply._\n\n/**\n * Takes a K term with the grammar described by Module module, and unparses it to its latex representation.\n * For each term:\n * - when its production has latex attribute, it uses that attribute for unparsing\n * - otherwise, it unparses by concatenating the Production's items with the separator parameter as a separator\n * @param module\n * @param separator\n */\nclass KtoLatex(module: Module, separator: String = \" \") extends ((K) => String) {\n var x = 0\n def apply(k: K): String = k match {\n case KApply(l, children) =>\n val latexAtts = module.productionsFor(l).flatMap(_.att.get[String](Att.latex))\n val latex = latexAtts.size match {\n case 0 => // no latex annotation\n val possibleLatexes = module.productionsFor(l).map(_.items.foldLeft((Seq[String](), 1)) {\n case ((r, i), t: Terminal) => (r :+ t.value, i)\n case ((r, i), t: RegexTerminal) => (r :+ t.regex, i) //TODO: we probably want something better here\n case ((r, i), nt: NonTerminal) => (r :+ \"#\" + i, i + 1)\n 
}).map(_._1.mkString(separator))\n possibleLatexes.size match {\n case 0 => throw new AssertionError(\"Could not find a label for \" + l)\n case 1 => possibleLatexes.head\n case _ => throw new AssertionError(\"Too many productions for klabel \"" +"FROM ubuntu:18.04\nMAINTAINER Intrigue Team <hello@intrigue.io>\nENV DEBIAN_FRONTEND noninteractive\n\nUSER root\n\n# Get up to date\nRUN apt-get -y update && apt-get -y install sudo\n\n# Set us up!\nWORKDIR /core\n\n# Set up intrigue\nENV BUNDLE_JOBS=12\nENV PATH /root/.rbenv/bin:$PATH\nENV IDIR=/core\nENV DEBIAN_FRONTEND=noninteractive\n\n# create a volume\nVOLUME /data\n\n# copy intrigue code\nCOPY . /core/\n\n# install intrigue-specific software & config\nRUN /bin/bash /core/util/bootstrap-worker.sh\n\n# Remove the config file so one generates on startup (useful when testing)\nRUN if [ -e /core/config/config.json ]; then rm /core/config/config.json; fi\n\n# Expose the port\nEXPOSE 7777\n\nRUN chmod +x /core/util/start-worker.sh\nENTRYPOINT [\"/core/util/start-worker.sh\"]" +"Intersil ISL1209/19 I2C RTC/Alarm chip with event in\n\nISL12X9 have additional pins EVIN and #EVDET for tamper detection, while the\nISL1208 and ISL1218 do not. 
They all use the same driver with the bindings\ndescribed here, with chip specific properties as noted.\n\nRequired properties supported by the device:\n - \"compatible\": Should be one of the following:\n\t\t- \"isil,isl1208\"\n\t\t- \"isil,isl1209\"\n\t\t- \"isil,isl1218\"\n\t\t- \"isil,isl1219\"\n - \"reg\": I2C bus address of the device\n\nOptional properties:\n - \"interrupt-names\": list which may contain \"irq\" and \"evdet\"\n\tevdet applies to isl1209 and isl1219 only\n - \"interrupts\": list of interrupts for \"irq\" and \"evdet\"\n\tevdet applies to isl1209 and isl1219 only\n - \"isil,ev-evienb\": Enable or disable internal pull on EVIN pin\n\tApplies to isl1209 and isl1219 only\n\tPossible values are 0 and 1\n\tValue 0 enables internal pull-up on evin pin, 1 disables it.\n\tDefault will leave the non-volatile configuration of the pullup\n\tas is.\n\nExample isl1219 node with #IRQ pin connected to SoC gpio1 pin12 and #EVDET pin\nconnected to SoC gpio2 pin 24 and internal pull-up enabled on EVIN pin.\n\n\tisl1219: rtc@68 {\n\t\tcompatible = \"isil,isl1219\";\n\t\treg = <0x68>;\n\t\tinterrupt-names = \"irq\", \"evdet\";\n\t\tinterrupts-extended = <&gpio1 12 IRQ_TYPE_EDGE_FALLING>,\n\t\t\t<&gpio2 24 IRQ_TYPE_EDGE_FALLING>;\n\t\tisil,ev-evienb" +"2-26\n> Prefer Git over SVN because branches are so cheap so it is easier to do the workflow of doing stuff in the branch then moving to master afterwards.\n> Arranging models: includes, constants, attrs, validations, relationships, `accepts_nested_attributes_for`\n> Feels good to regex the rework summarization thing.\n> Read up on replacing conditional with polymorphism. This will def help on the authorizations and differing user levels in Radiograph. Learned on how views are interpolated by Rails' AR.\n> Read up on STI vs. polymorphism on modelling inherited classes. Though I took a long time it is worth it, I think.\n> Read up on CanCan but I'm not going to use it yet. 
Primarily because there are not yet that many features.\n> There is such a thing as an abstract AR::Base class.\n> Model generators also create a migration\n> Delegates make methods available across models.\n> to create a table: rails generate migration CreatePosts\n> attr_accessible is so that we can't let users update values\n> Don't forget the confirm thing on the link_to when you have to delete: `link_to \"Delete\", post_path(@post), :method => :delete, :confirm => \"Are you sure?\"`\n> Active Record: Post.find(478, 1134) to get" +"IfsCompose\n----------\n\nIfsCompose is a plug-in for GIMP that allows\nthe creation of Iterated Function System fractals by direct\nmanipulation onscreen of the component transforms.\n\n\nIFS Fractals\n------------\n\nYou may be familiar with IFS's from the screen\nhack 'Flame'. They are also the basis of fractal image compression.\n\nFor a brief introduction to IFS's see Foley and van Dam, et\nal,. _Computer Graphics, Principles and Practice_, 2nd Ed., \n(Addison Wesley, 1990).\n\nThe standard references in the field are Michael Barnsley's books (though\nI haven't looked at them yet):\n\nM. Barnsley, _Fractals Everywhere_, Academic Press Inc., 1988.\nM. Barnsley and L. Hurd, _Fractal Image Compression_, Jones and\nBartlett.\n\nBriefly, you take a point and repeatedly apply one of a set of\ntransformations to it, choosing randomly between them, and plot the\npoint at each step. An interesting result (the Collage Theorem) says\nthat if you can find a set of transformations that break up an image\ninto smaller copies of itself, then the resulting fractal exactly\nreproduces the original image. 
For example, here is a classic image\nof a leaf and the same image with the four component transforms\ncolored distinctively.\n\nBut the best way to appreciate this may be to install" +"/**\n@mainpage\n\n@section c_intro_sec Introduction\n\n\n\n@section c_install_sec Building and Installation\n\n@subsection starting Getting Started \n\nFor general instructions about how to install CCNx, please refer to\nthe top-level README file.\n\n@subsection howitworks How the c build works \n\nThe C build sticks as closely as possible to the common subset\nof make(1) features, using the Posix specifications as a guide.\n\nSince most (or all) make implementations have their own extensions, staying\nwithin this subset is a continual challenge. The CCNx build is tested with\nGNU make, BSD make, and Solaris make.\n\nThe ./configure script's main job is to build the conf.mk file that defines the\nconfigurable make macros. This is done largely just by using the\noutput of the uname command, but scripted configuration is possible as well.\nThe csrc/conf/ directory contains the system-specific make fragments and scripts.\n\nIf you need to override the configured values, put them into a file\nnamed csrc/conf/local.mk before you execute ./configure in csrc/.\n\nIt is also possible to define certain environment variables\nto modify default guesses of the ./configure script.\nThe most useful of these are:\n- CFLAGS (default: -g)\n- INSTALL_BASE (default: /usr/local)\n- AR (default: leave unset to be supplied by make)\n- CC" 
+"ALE\nALL\nALWAYS\nANGKOK\nBABY\nBEYOND\nBOARDS\nBODISCIENCES\nBOERIEN\nBRAND\nBROTHERS\nBROWN\nBUD\nCAVENAGH\nCBTL\nCINEPLE\nCITIBANK\nCONNECTOR\nCOOK\nDIAPERS\nDINING\nDIRECTORY\nDISCOUNTS\nDORAEMAN\nECCO\nERA\nESPLANADE\nEXIT\nFACE\nFIBRE\nFINEST\nFREE\nGIORDAN\nGUCCI\nHAIR\nHEAD\nHOU\nINDIA\nINDIAN\nINTO\nJCB\nKINOKUNIVA\nLEAVES\nLINK\nLONDON\nLOST\nMAARTEN\nMARC\nMARINA\nMASTER\nMONTH\nMORE\nMOVE\nNALE\nNETB\nNOX\nOCEAN\nOPPING\nOPS\nOPTIONS\nORDER\nOUTLET\nPERKINS\nPLAZA\nPOTATOES\nREPUBLIC\nROOK\nSAHI\nSAY\nSEA\nSHOPPING\nSHU\nSINGPOST\nSK-II\nSPA\nSQUARE\nSTATION\nSTERN\nSTICKY\nSTONE\nSTORE\nSWATC\nSWEATS\nTAGH\nTEA\nTEMT\nTHIRSTY\nTHIS\nTIMEWISE\nTOBLERONE\nTOKYO\nTOWER\nTOWN\nTWO\nUNI\nVUITTON\nWAY\nWOOBO\nWORK\nWORKSHOPELEM" +"---\ntitle: System Props\n---\n\nimport {PropsList, COMMON, LAYOUT, BORDER, TYPOGRAPHY, FLEX, POSITION, GRID} from '../components'\n\nPrimer React components utilize what we call \"system props\" to apply a standard set of props to each component. Using [styled-system](https://github.com/jxnblk/styled-system), groups of props are automatically applied to each component. Most components get the `COMMON` set of props which give the component access to color and space props (margin, padding, color and background color). These groups correspond to the `color` and `space` functions from `styled-system` which can be referenced in the styled system [table of style functions](https://github.com/jxnblk/styled-system/blob/master/docs/table.md#core).\n\nTo check which system props each component includes, check the documentation for that component.\n\n### The `as` prop\nAll Primer React components have access to the `as` prop, provided by [styled-components](https://www.styled-components.com/docs/api#as-polymorphic-prop). 
We use the `as` prop to render a component with the styles of the passed component in `as`, but with the system props of the base component.\n\nFor example, if you wanted to add some flex utilities to the `Text` component, you could do:\n\n```jsx live\n<Text as={Flex}>Hello!</Text>\n```\n\n\n### System Prop Categories\n\n| Category | Included Props | styled-system docs |\n|-----|--------|--------|\n| `COMMON`| <PropsList systemProps={COMMON}/>| [styled-system core docs](https://github.com/jxnblk/styled-system/blob/master/docs/table.md#core) |\n| `TYPOGRAPHY`| <PropsList systemProps={TYPOGRAPHY}/> |" +"---\nlang: PT-BR\ntitle: Sum\u00e1rio #6 Que Significa Que Voc\u00ea Foi Bem Longe\nanswer: \\{\\}\nclass: stretcher chapmark\nok: Ok, \u00e9 um Hash vazio\nerror: \n---\n\nVoc\u00ea \u00e9 um cl\u00e9rigo Ruby level 6. Quero dizer, que grande trabalho voc\u00ea fez. Vamos revisar:\n\n### Dados\nVoc\u00ea carregou alguns dados da internet,\n\n### Iterando\nVoc\u00ea iterou todos os elementos de um hash e voc\u00ea encadeou alguns outros m\u00e9todos.\n\n### Imprimindo Bonito\nE como se isso n\u00e3o fosse o bastante, voc\u00ea formatou e imprimiu alguns valores de uma forma\nque \u00e9 f\u00e1cil para humanos ler. De fato, __voc\u00ea fez um programa real!__\n\n### IF\nVoc\u00ea aprendeu a tomar o controle dos seus programas com declara\u00e7\u00f5es de __if__ e __else__.\n\n## Ent\u00e3o\nO que \u00e9 poss\u00edvel fazer em seguida? O que \u00e9 poss\u00edvel que voc\u00ea ainda tenha que aprender agora?\nHa! Esta \u00e9 a melhor parte. 
Voc\u00ea percorreu um caminho t\u00e3o grande que agora vamos revelar as classes.\nApenas mais duas li\u00e7\u00f5es curtas, e acabou.\n\nMais cedo, n\u00f3s criamos um Hash desta forma:\n\n Hash.new" +" 0\nSECTION\n 2\nHEADER\n 9\n$ACADVER\n 1\nAC1009\n 9\n$MEASUREMENT\n 70\n 1\n 0\nENDSEC\n 0\nSECTION\n 2\nENTITIES\n 0\nLWPOLYLINE\n 8\n0\n 62\n7\n 90\n5\n 70\n1\n 10\n14.271562\n 20\n2.189987\n 10\n14.271562\n 20\n1.441550\n 10\n15.568437\n 20\n1.441550\n 10\n15.568437\n 20\n2.938425\n 10\n14.271562\n 20\n2.938425\n 0\nLWPOLYLINE\n 8\n0\n 62\n7\n 90\n5\n 70\n1\n 10\n12.371562\n 20\n2.189987\n 10\n12.371562\n 20\n1.441550\n 10\n13.668437\n 20\n1.441550\n 10\n13.668437\n 20\n2.938425\n 10\n12.371562\n 20\n2.938425\n 0\nLWPOLYLINE\n 8\n0\n 62\n7\n 90\n5\n 70\n1\n 10\n14.271562\n 20\n4.729988\n 10\n14.271562\n 20\n3.981550\n 10\n15.568437\n 20\n3.981550\n 10\n15.568437\n 20\n5.478425\n 10\n14.271562\n 20\n5.478425\n 0\nLWPOLYLINE\n 8\n0\n 62\n7\n 90\n5\n 70\n1\n 10\n12.371562\n 20\n4.729988\n 10\n12.371562\n 20\n3.981550\n 10\n13.668437\n 20\n3.981550\n 10\n13.668437\n 20\n5.478425\n 10\n12.371562\n 20\n5.478425\n 0\nLWPOLYLINE\n 8\n0\n 62\n7\n 90\n5\n 70\n1\n 10\n31.788437\n 20\n9.809988\n 10\n31.788437\n 20\n10.208425\n 10\n30.441562\n 20\n10.208425\n 10\n30.441562\n 20\n9.411550\n 10\n31.788437\n 20\n9.411550\n 0\nLWPOLYLINE\n 8\n0\n 62\n7\n 90\n5\n 70\n1\n 10\n29.248437\n 20\n9.809988\n 10\n29.248437\n 20\n10.208425\n 10\n27.901563\n 20\n10.208425\n 10\n27.901563\n 20\n9.411550\n 10\n29.248437\n 20\n9.411550\n 0\nLWPOLYLINE" 
+"35\n15\n11\n13\n17\n11\n9\n3\n3\n5\n11\n15\n7\n3\n9\n7\n9\n49\n7\n7\n5\n3\n9\n21\n11\n5\n19\n11\n11\n13\n7\n23\n21\n13\n7\n5\n17\n0\n7\n3\n17\n15\n5\n7\n3\n3\n17\n21\n21\n25\n15\n3\n3\n13\n15\n5\n9\n9\n11\n9\n19\n27\n7\n13\n19\n9\n7\n7\n7\n3\n5\n3\n5\n31\n3\n21\n11\n21\n13\n7\n2\n7\n13\n5\n17\n7\n5\n23\n2\n13\n7\n21\n11\n5\n7\n9\n7\n11\n23\n7\n7\n9\n5\n11\n11\n7\n17\n15\n13\n7\n15\n5\n3\n19\n15\n7\n11\n11\n37\n7\n5\n5\n41\n9\n5\n11\n9\n17\n7\n11\n5\n19\n13\n15\n37\n9\n15\n27\n27\n7\n7\n9\n5\n5\n21\n7\n15\n13\n23\n21\n21\n2\n2\n21\n5\n0\n7\n27\n5\n3\n15\n21\n9\n5\n19\n7\n3\n11\n13\n9\n5\n25\n11\n13\n7\n11\n13\n28\n19\n5\n5\n11\n9\n31\n19\n5\n15\n19\n11\n21\n9\n3\n15\n3\n17\n9\n3\n27\n17\n3" +"import networkx as nx\nimport random\nimport time\nfrom networkx.classes.function import is_directed\n\nfrom networkx.algorithms.isomorphism.tree_isomorphism import (\n rooted_tree_isomorphism,\n tree_isomorphism,\n)\n\n\n# have this work for graph\n# given two trees (either the directed or undirected)\n# transform t2 according to the isomorphism\n# and confirm it is identical to t1\n# randomize the order of the edges when constructing\ndef check_isomorphism(t1, t2, isomorphism):\n\n # get the name of t1, given the name in t2\n mapping = {v2: v1 for (v1, v2) in isomorphism}\n\n # these should be the same\n d1 = is_directed(t1)\n d2 = is_directed(t2)\n assert d1 == d2\n\n edges_1 = []\n for (u, v) in t1.edges():\n if d1:\n edges_1.append((u, v))\n else:\n # if not directed, then need to\n # put the edge in a consistent direction\n if u < v:\n edges_1.append((u, v))\n else:\n edges_1.append((v, u))\n\n edges_2 = []\n for (u, v) in t2.edges():\n # translate to names for t1\n u = mapping[u]\n v = mapping[v]\n if d2:\n edges_2.append((u, v))\n else:\n if u < v:\n edges_2.append((u, v))\n else:\n edges_2.append((v, u))\n\n return sorted(edges_1) == sorted(edges_2)\n\n\ndef test_hardcoded():\n\n print(\"hardcoded test\")\n\n # define a test problem\n edges_1 = [\n (\"a\", \"b\"),\n 
(\"a\", \"c\"),\n (\"a\", \"d\"),\n (\"b\", \"e\"),\n (\"b\", \"f\"),\n (\"e\", \"j\"),\n (\"e\", \"k\"),\n (\"c\"," +"#\n# Obsolete! New projects should use .azure/templates/jobs/default-build.yml instead\n#\n# TODO: remove this once templated projects have referenced the new location.\n#\n# default-build.yml\n# Description: Defines a build phase for invoking build.sh/cmd\n# Parameters:\n# phaseName: string\n# The name of the phase. Defaults to the name of the OS.\n# queueName: string\n# The name of the VSTS agent queue to use.\n# agentOs: string\n# Used in templates to define variables which are OS specific. Typically from the set { Windows, Linux, macOS }\n# buildArgs: string\n# Additional arguments to pass to the build.sh/cmd script.\n# Note: -ci is always passed\n# beforeBuild: [steps]\n# Additional steps to run before build.sh/cmd\n# afterBuild: [steps]\n# Additional steps to run after build.sh/cmd\n# artifacts:\n# publish: boolean\n# Should artifacts be published\n# path: string\n# The file path to artifacts output\n# name: string\n# The name of the artifact container\n# variables: { string: string }\n# A map of custom variables\n# matrix: { string: { string: string } }\n# A map of matrix configurations and variables. https://docs.microsoft.com/en-us/vsts/pipelines/yaml-schema?view=vsts#matrix\n# demands: string | [ string ]\n# A list of agent demands. https://docs.microsoft.com/en-us/vsts/pipelines/yaml-schema?view=vsts#demands" +"Most dialogs have a help button that takes you to the relevant online documentation\r\nYou can use \"Ctrl + [\" or \"Ctrl + ]\" to indent/outdent a block of code in Coder view\r\nBoth in the Coder and Builder views have a demos menu with plenty of demos. There are more on Pavlovia.org\r\nFrom Builder you can use \"Compile\" to generate the Python script that controls your experiment, and view or edit it in the Coder. 
(Any edits are not reflected back in the Builder, however.)\r\nYou should avoid snorting Pepsi\r\nMost buttons have helpful usage tips if you hover over them\r\nPsychoPy can handle many different units (degrees of visual angle, cm...) for your stimuli. But you need to tell it about your monitor first (see the online documentation on General > Units)\r\nMenu items show you the current key bindings (which can be configured in preferences)\r\nFor brief stimuli, you should use frames rather than time to set the start/stop of your stimuli. Most monitor frame rates are precise to the microsecond!\r\nIt's a really good idea to check your data are all being saved as you like BEFORE running all of your participants!\r\nBuilder: right-clicking on any" +"# Sharing GPU Resources\n\nOften times, teams are running big ML models on instances with GPU resources.\n\nGPU instances are expensive! You want to make sure that you're utilizing the GPUs you're paying for!\n\n## Without configuration\n\n[To deploy a pipeline that relies on GPU](gpus.md), you'll already have set the `gpu` resource requirement in the pipeline specification. But Pachyderm workers by default are long lived ... the worker is spun up and waits for new input. That works great for pipelines that are processing a lot of new incoming commits.\n\nFor ML workflows, especially during the development cycle, you probably will see lower volume of input commits. Which means that you could have your pipeline workers 'taking' the GPU resource as far as k8s is concerned, but 'idling' as far as you're concerned.\n\nLet's use an example.\n\nLet's say your cluster has a single GPU node with 2 GPUs. Let's say you have a pipeline running that requires 1 GPU. You've trained some models, and found the results were surprising. You suspect your feature extraction code, and are delving into debugging that stage of your pipeline. Meanwhile, the worker you've spun up for your GPU training job is sitting idle,"