import copy
from collections import defaultdict

import six


class _Router:
    """Matches sample props to destinations according to a routing policy

    Parameters
    ----------
    routing : dict {dest: props} for dest str, props {str, list, dict}
        User-defined routing policy. Maps each destination string to the
        properties that should be provided to that destination. Props may be:

        - '*': provide all properties
        - a list of property names to include
        - a list of property names, each prefixed by '-', to exclude; all
          others will be provided
        - a single name to include or exclude
        - a dict mapping each target property name to its source property name

    dests : list of {str, iterable of str}
        The ordered destinations for this router. If a set of strings is
        provided for each entry, any of these strings will route parameters
        to the destination. Usually this is fixed for a metaestimator.

    Notes
    -----
    Abstracting away destination names/aliases in this way allows for
    providing syntactic sugar to users (e.g. Pipeline can declare '*' or
    'steps' as an alias for "provide these props to ``fit`` of all steps").
    While this may be instead facilitated by some string-based pattern
    matching, the present approach is more explicit and ensures backwards
    compatibility can be maintained.
    """

    def __init__(self, routing, dests):
        # can immediately:
        # * check that all routes have valid dests
        # * consolidate routing across aliases, such that each must be
        #   either a blacklist of length 0 or more or a mapping
        alias_to_idx = defaultdict(list)
        for i, aliases in enumerate(dests):
            if isinstance(aliases, six.string_types):
                aliases = [aliases]
            for alias in aliases:
                alias_to_idx[alias].append(i)
        alias_to_idx.default_factory = None

        policies = [None] * len(dests)
        # Sorted so error messages are deterministic
        for dest, props in sorted(routing.items()):
            if props == '*':
                policy = _AllPolicy()
            else:
                if isinstance(props, six.string_types):
                    props = [props]
                if isinstance(props, dict):
                    policy = _IncludePolicy(props)
                else:
                    minuses = [prop[:1] == '-' for prop in props]
                    if all(minuses):
                        policy = _ExcludePolicy({prop[1:] for prop in props})
                    elif any(minuses):
                        raise ValueError('Routing props should either all '
                                         'start with "-" or none should start '
                                         'with "-". Got a mix for %r' % dest)
                    else:
                        policy = _IncludePolicy({prop: prop
                                                 for prop in props})

            # raises KeyError if unknown dest
            for idx in alias_to_idx[dest]:
                if policies[idx] is None:
                    policies[idx] = copy.deepcopy(policy)
                else:
                    if type(policies[idx]) is not type(policy):
                        raise ValueError('When handling routing for '
                                         'destination {!r}, found a mix of '
                                         'inclusion, exclusion and pass all '
                                         'policies.'.format(dest))
                    policies[idx].update(policy)

        self.policies = [_NonePolicy() if policy is None else policy
                         for policy in policies]

    def __call__(self, props):
        """Apply the routing policy to the given sample props

        Parameters
        ----------
        props : dict

        Returns
        -------
        dest_props : list of dicts
            Props to be passed to each destination declared in the
            constructor.
        unused : set of str
            Names of props that were not routed anywhere.
        """
        unused = set(props)
        out = [policy.apply(props, unused) for policy in self.policies]
        return out, unused
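A minimal usage sketch of the router above. This is illustrative only: the policy classes (`_AllPolicy`, `_IncludePolicy`, `_ExcludePolicy`, `_NonePolicy`) are not shown, the destination names are hypothetical, and the expected outputs assume the docstring's semantics.

# Hypothetical two-step metaestimator: index 0 is a transformer, index 1 an
# estimator; '*' is an alias covering both destinations.
router = _Router(
    routing={
        '*': 'sw',                       # include 'sw' everywhere
        'est': {'sample_weight': 'wt'},  # rename 'wt' -> 'sample_weight'
    },
    dests=[{'trans', '*'}, {'est', '*'}],
)
dest_props, unused = router({'sw': [1, 2], 'wt': [3, 4], 'extra': [5]})
# Expected, per the docstring semantics:
# dest_props[0] == {'sw': [1, 2]}
# dest_props[1] == {'sw': [1, 2], 'sample_weight': [3, 4]}
# unused == {'extra'}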
// This is a generated file. Not intended for manual editing.
package io.github.sof3.libglocal.intellij.psi;

import java.util.List;
import org.jetbrains.annotations.*;
import com.intellij.psi.PsiElement;

public interface LgcArithmeticPredicate extends PsiElement {

  @NotNull
  LgcMathComparator getMathComparator();

  @Nullable
  PsiElement getMathMod();

  @NotNull
  PsiElement getMathSeparator();

}
from typing import Any, Callable, Iterator


def iterate(func: Callable[[Any], Any], *, start: Any) -> Iterator:
    """Yield start, func(start), func(func(start)), ... indefinitely."""
    x = start
    while True:
        yield x
        x = func(x)
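For instance, the infinite stream can be consumed lazily with `itertools.islice`:

from itertools import islice

# Powers of two: 1, 2, 4, 8, 16, ...
powers = iterate(lambda x: 2 * x, start=1)
print(list(islice(powers, 5)))  # [1, 2, 4, 8, 16]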
Coagulation autolysis in microorganisms and its relation to coagulase production

The phenomenon of coagulation autolysis was observed in two model microorganisms, i.e., a bacterial culture and an imperfect fungus. It was characterized by impairment of the cell membranes, followed by condensation and dehydration of the cytoplasm and long-term preservation of the cells in the form of coagulated cytoplasm. In this respect, it was similar to coagulation necrosis of human tissues. The autolysis in the microorganisms was accompanied by an increase in their coagulase activity, the substrate specificity of the enzyme being rather broad. The coagulase activity of the microorganisms was detected during the culture period between the lag phase and the exponential growth phase, i.e., the phase of their active growth. It served as a signal to induce biosynthesis of peptidohydrolase and cleavage of proteins. We believe that the phenomenon of coagulation autolysis in these microorganisms is rather typical and can be considered an adaptive reaction, inducing a cascade of events from synthesis of coagulase to overproduction of peptidohydrolases with proteolytic activity.
package com.gildedrose;

import static org.junit.Assert.*;

import org.junit.Test;

public class MyGildedRoseTest {

    @Test
    public void testGildedRose() {
        // fail("Not yet implemented");
    }

    @Test
    public void testUpdateQuality() {
        Item[] item = new Item[] { new Item("+5 Dexterity Vest", 10, 20) };
        GildedRose app = new GildedRose(item);
        app.updateQuality();

        /**** TESTS ON A NORMAL PRODUCT ***********/
        // positive sellIn and quality: both values decrease
        assertTrue("normal decrease", item[0].sellIn == 9 && item[0].quality == 19);

        // positive sellIn, quality 0: quality must never drop below 0
        item[0].quality = 0;
        app.updateQuality();
        assertFalse("decrease of a product with quality 0", item[0].quality < 0);

        // negative sellIn, positive quality: quality keeps decreasing once sellIn is negative
        item[0].quality = 20;
        item[0].sellIn = -10;
        app.updateQuality();
        assertFalse("decrease with negative sellIn", item[0].quality < 0);

        // negative sellIn and negative quality
        item[0].quality = -10;
        item[0].sellIn = -10;
        app.updateQuality();
        assertTrue("negative quality is left unchanged", item[0].quality == -10 && item[0].sellIn == -11);

        /******** TESTS ON SULFURAS ***************/
        // the values of Sulfuras never change
        item[0].name = "Sulfuras, Hand of Ragnaros";
        item[0].quality = 20;
        item[0].sellIn = 10;
        app.updateQuality();
        assertTrue("Sulfuras", item[0].quality == 20 && item[0].sellIn == 10);

        item[0].name = "Sulfuras, Hand of Ragnaros";
        item[0].quality = 20;
        item[0].sellIn = -10;
        app.updateQuality();
        assertTrue("Sulfuras", item[0].quality == 20 && item[0].sellIn == -10);

        /************** TESTS ON AGED BRIE **************/
        // sellIn < 6, quality < 50
        item[0].name = "<NAME>";
        item[0].quality = 20;
        item[0].sellIn = 3;
        app.updateQuality();
        assertTrue("Aged Brie, sellIn under 6", item[0].quality == 21 && item[0].sellIn == 2);

        // sellIn < 11, quality < 50
        item[0].name = "<NAME>";
        item[0].quality = 20;
        item[0].sellIn = 8;
        app.updateQuality();
        assertTrue("Aged Brie, sellIn under 11", item[0].quality == 21 && item[0].sellIn == 7);

        // sellIn > 11, quality < 50
        item[0].name = "<NAME>";
        item[0].quality = 20;
        item[0].sellIn = 15;
        app.updateQuality();
        assertTrue("Aged Brie, sellIn over 11", item[0].quality == 21 && item[0].sellIn == 14);

        // sellIn 0, quality < 50: quality increases twice as fast
        item[0].name = "<NAME>";
        item[0].quality = 20;
        item[0].sellIn = 0;
        app.updateQuality();
        assertTrue("Aged Brie, sellIn 0", item[0].quality == 22 && item[0].sellIn == -1);

        // negative sellIn, quality < 50
        item[0].name = "<NAME>";
        item[0].quality = 20;
        item[0].sellIn = -10;
        app.updateQuality();
        assertTrue("Aged Brie, negative sellIn", item[0].quality == 22 && item[0].sellIn == -11);

        // quality already above 50 stays unchanged
        item[0].name = "<NAME>";
        item[0].quality = 60;
        item[0].sellIn = -10;
        app.updateQuality();
        assertTrue("Aged Brie, quality above 50", item[0].quality == 60 && item[0].sellIn == -11);

        /**************** TESTS ON BACKSTAGE PASSES *************/
        // quality is capped at 50
        item[0].name = "Backstage passes to a TAFKAL80ETC concert";
        item[0].quality = 50;
        app.updateQuality();
        assertFalse("quality equal to 50", item[0].quality > 50);

        // sellIn < 6, quality < 50: quality increases by 3
        item[0].name = "Backstage passes to a TAFKAL80ETC concert";
        item[0].quality = 20;
        item[0].sellIn = 2;
        app.updateQuality();
        assertTrue("Backstage, sellIn under 6", item[0].quality == 23 && item[0].sellIn == 1);

        // sellIn < 11, quality < 50: quality increases by 2
        item[0].name = "Backstage passes to a TAFKAL80ETC concert";
        item[0].quality = 20;
        item[0].sellIn = 8;
        app.updateQuality();
        assertTrue("Backstage, sellIn under 11", item[0].quality == 22 && item[0].sellIn == 7);

        // sellIn > 11, quality < 50: quality increases by 1
        item[0].name = "Backstage passes to a TAFKAL80ETC concert";
        item[0].quality = 20;
        item[0].sellIn = 15;
        app.updateQuality();
        assertTrue("Backstage, sellIn over 11", item[0].quality == 21 && item[0].sellIn == 14);

        // quality above 50 stays unchanged
        item[0].quality = 60;
        item[0].sellIn = 15;
        app.updateQuality();
        assertTrue("Backstage above 50 quality", item[0].quality == 60 && item[0].sellIn == 14);

        // quality may reach but not exceed 50, sellIn < 11
        item[0].quality = 49;
        item[0].sellIn = 8;
        app.updateQuality();
        assertTrue("Backstage capped at 50, sellIn under 11", item[0].quality == 50 && item[0].sellIn == 7);

        // quality may reach but not exceed 50, sellIn < 6
        item[0].quality = 49;
        item[0].sellIn = 3;
        app.updateQuality();
        assertTrue("Backstage capped at 50, sellIn under 6", item[0].quality == 50 && item[0].sellIn == 2);

        // sellIn == 0: quality drops to 0 after the concert
        item[0].name = "Backstage passes to a TAFKAL80ETC concert";
        item[0].quality = 20;
        item[0].sellIn = 0;
        app.updateQuality();
        assertTrue("Backstage after the concert", item[0].quality == 0 && item[0].sellIn == -1);

        // negative sellIn: quality stays 0
        item[0].name = "Backstage passes to a TAFKAL80ETC concert";
        item[0].quality = 20;
        item[0].sellIn = -15;
        app.updateQuality();
        assertTrue("Backstage long after the concert", item[0].quality == 0);
    }
}
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.hadoop.fs.contract.s3a;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.contract.AbstractFSContract;
import org.apache.hadoop.fs.contract.AbstractContractMultipartUploaderTest;
import org.apache.hadoop.fs.contract.ContractTestUtils;
import org.apache.hadoop.fs.s3a.S3AFileSystem;
import org.apache.hadoop.fs.s3a.WriteOperationHelper;

import static org.apache.hadoop.fs.s3a.S3ATestConstants.*;
import static org.apache.hadoop.fs.s3a.S3ATestUtils.*;
import static org.apache.hadoop.fs.s3a.scale.AbstractSTestS3AHugeFiles.DEFAULT_HUGE_PARTITION_SIZE;

/**
 * Test MultipartUploader with S3A.
 * Although not an S3A Scale test subclass, it uses the -Dscale option
 * to enable it, and partition size option to control the size of
 * parts uploaded.
 */
public class ITestS3AContractMultipartUploader extends
    AbstractContractMultipartUploaderTest {

  private static final Logger LOG =
      LoggerFactory.getLogger(ITestS3AContractMultipartUploader.class);

  private int partitionSize;

  /**
   * S3 requires a minimum part size of 5MB (except the last part).
   * @return 5MB
   */
  @Override
  protected int partSizeInBytes() {
    return partitionSize;
  }

  @Override
  protected int getTestPayloadCount() {
    return 3;
  }

  @Override
  public S3AFileSystem getFileSystem() {
    return (S3AFileSystem) super.getFileSystem();
  }

  /**
   * Create a configuration, possibly patching in S3Guard options.
   * @return a configuration
   */
  @Override
  protected Configuration createConfiguration() {
    Configuration conf = super.createConfiguration();
    maybeEnableS3Guard(conf);
    return conf;
  }

  @Override
  protected AbstractFSContract createContract(Configuration conf) {
    return new S3AContract(conf);
  }

  /**
   * Bigger test: use the scale timeout.
   * @return the timeout for scale tests.
   */
  @Override
  protected int getTestTimeoutMillis() {
    return SCALE_TEST_TIMEOUT_MILLIS;
  }

  @Override
  protected boolean supportsConcurrentUploadsToSamePath() {
    return true;
  }

  /**
   * Provide a pessimistic time to become consistent.
   * @return a time in milliseconds
   */
  @Override
  protected int timeToBecomeConsistentMillis() {
    return 30 * 1000;
  }

  @Override
  protected boolean finalizeConsumesUploadIdImmediately() {
    return false;
  }

  @Override
  public void setup() throws Exception {
    super.setup();
    Configuration conf = getContract().getConf();
    boolean enabled = getTestPropertyBool(
        conf,
        KEY_SCALE_TESTS_ENABLED,
        DEFAULT_SCALE_TESTS_ENABLED);
    assume("Scale test disabled: to enable set property " +
        KEY_SCALE_TESTS_ENABLED, enabled);
    partitionSize = (int) getTestPropertyBytes(
        conf,
        KEY_HUGE_PARTITION_SIZE,
        DEFAULT_HUGE_PARTITION_SIZE);
  }

  /**
   * Extend superclass teardown with actions to help clean up the S3 store,
   * including aborting uploads under the test path.
   */
  @Override
  public void teardown() throws Exception {
    Path teardown = path("teardown").getParent();
    S3AFileSystem fs = getFileSystem();
    if (fs != null) {
      WriteOperationHelper helper = fs.getWriteOperationHelper();
      try {
        LOG.info("Teardown: aborting outstanding uploads under {}", teardown);
        int count = helper.abortMultipartUploadsUnderPath(
            fs.pathToKey(teardown));
        LOG.info("Found {} incomplete uploads", count);
      } catch (Exception e) {
        LOG.warn("Exception in teardown", e);
      }
    }
    super.teardown();
  }

  /**
   * S3 has no concept of directories, so this test does not apply.
   */
  public void testDirectoryInTheWay() throws Exception {
    // no-op
  }

  @Override
  public void testMultipartUploadReverseOrder() throws Exception {
    ContractTestUtils.skip("skipped for speed");
  }
}
#include "xbaseband.h"
#include "csr_control.h"
#include "ringbus.h"
#include "ringbus2_pre.h"
#include "ringbus2_post.h"
#include "stall2.h"

unsigned got = 0;
unsigned correct = 0;

void rbcb(const unsigned int data) {
    SET_REG(x3, 0x0);
    // SET_REG(x3, 0xffffffff);
    SET_REG(x3, data);

    const unsigned expected = 0xC001 + got;
    if (data == expected) {
        correct++;
    }
    got++;
}

int main(void) {
    int sent_results = 0;
    Ringbus ringbus;

    ring_register_callback(&rbcb, EDGE_EDGE_IN);
    stall2(1550);

    while (1) {
        STALL(1000);
        STALL(1000);
        STALL(1000);
        check_ring(&ringbus);

        if (correct >= 16 && !sent_results) {
            ring_block_send_eth(EDGE_EDGE_OUT | 1);
            sent_results = 1;
        }
    }
}
package mem

import (
	"testing"
	"time"

	"github.com/influxdata/telegraf"
	"github.com/influxdata/telegraf/plugins/inputs/system"
	"github.com/influxdata/telegraf/testutil"
	"github.com/shirou/gopsutil/v3/mem"
	"github.com/stretchr/testify/require"
)

func TestMemStats(t *testing.T) {
	var mps system.MockPS
	var err error
	defer mps.AssertExpectations(t)
	var acc testutil.Accumulator

	vms := &mem.VirtualMemoryStat{
		Total:     12400,
		Available: 7600,
		Used:      5000,
		Free:      1235,
		Active:    8134,
		Inactive:  1124,
		Slab:      1234,
		Wired:     134,
		// Buffers: 771,
		// Cached:  4312,
		// Shared:  2142,
		CommitLimit:    1,
		CommittedAS:    118680,
		Dirty:          4,
		HighFree:       0,
		HighTotal:      0,
		HugePageSize:   4096,
		HugePagesFree:  0,
		HugePagesTotal: 0,
		LowFree:        69936,
		LowTotal:       255908,
		Mapped:         42236,
		PageTables:     1236,
		Shared:         0,
		Sreclaimable:   1923022848,
		Sunreclaim:     157728768,
		SwapCached:     0,
		SwapFree:       524280,
		SwapTotal:      524280,
		VmallocChunk:   3872908,
		VmallocTotal:   3874808,
		VmallocUsed:    1416,
		WriteBack:      0,
		WriteBackTmp:   0,
	}

	mps.On("VMStat").Return(vms, nil)

	plugin := &MemStats{ps: &mps}

	err = plugin.Init()
	require.NoError(t, err)

	plugin.platform = "linux"

	err = plugin.Gather(&acc)
	require.NoError(t, err)

	expected := []telegraf.Metric{
		testutil.MustMetric(
			"mem",
			map[string]string{},
			map[string]interface{}{
				"total":             uint64(12400),
				"available":         uint64(7600),
				"used":              uint64(5000),
				"available_percent": float64(7600) / float64(12400) * 100,
				"used_percent":      float64(5000) / float64(12400) * 100,
				"free":              uint64(1235),
				"cached":            uint64(0),
				"buffered":          uint64(0),
				"active":            uint64(8134),
				"inactive":          uint64(1124),
				// "wired":          uint64(134),
				"slab":             uint64(1234),
				"commit_limit":     uint64(1),
				"committed_as":     uint64(118680),
				"dirty":            uint64(4),
				"high_free":        uint64(0),
				"high_total":       uint64(0),
				"huge_page_size":   uint64(4096),
				"huge_pages_free":  uint64(0),
				"huge_pages_total": uint64(0),
				"low_free":         uint64(69936),
				"low_total":        uint64(255908),
				"mapped":           uint64(42236),
				"page_tables":      uint64(1236),
				"shared":           uint64(0),
				"sreclaimable":     uint64(1923022848),
				"sunreclaim":       uint64(157728768),
				"swap_cached":      uint64(0),
				"swap_free":        uint64(524280),
				"swap_total":       uint64(524280),
				"vmalloc_chunk":    uint64(3872908),
				"vmalloc_total":    uint64(3874808),
				"vmalloc_used":     uint64(1416),
				"write_back":       uint64(0),
				"write_back_tmp":   uint64(0),
			},
			time.Unix(0, 0),
			telegraf.Gauge,
		),
	}

	testutil.RequireMetricsEqual(t, expected, acc.GetTelegrafMetrics(),
		testutil.IgnoreTime())
}
import os


def remove_path(input_path, depth=-1):
    """Return the last abs(depth) components of input_path, joined."""
    try:
        if depth > 0:
            depth = 0 - depth
        elif depth == 0:
            depth = -1
        levels = []
        while depth <= -1:
            levels.append(input_path.split(os.sep)[depth])
            depth += 1
        return os.path.join(*levels)
    except IndexError:
        # Asked for more components than the path contains;
        # retry with one component fewer.
        return remove_path(input_path, depth=depth + 1)
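A quick illustration of the behaviour (output assumes a POSIX `os.sep` of '/'):

# Keep the last two components of a path.
print(remove_path("/var/log/app/debug.log", depth=2))  # app/debug.log

# The default depth keeps only the final component.
print(remove_path("/var/log/app/debug.log"))           # debug.log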
The effects of diet and physiological stress on the evolutionary dynamics of an enzyme polymorphism

In the northern acorn barnacle Semibalanus balanoides, polymorphism at the mannose-6-phosphate isomerase (Mpi) locus appears to be maintained by distinct selection regimes that vary between intertidal microhabitats. The goal of the present experiment was to elucidate the mechanism of selection at the Mpi locus by examining the relationship between genotype and fitness-related life-history traits in laboratory manipulations. When barnacles were cultured on a mannose-supplemented diet and exposed to thermal stress, different Mpi genotypes exhibited differences in the rate of growth that predicted survivorship. In contrast, no such relationship was observed in control or fructose-supplemented dietary treatments, either in the presence or in the absence of stress. Similarly, the phenotype and survivorship of genotypes at another allozyme locus and a presumably neutral mitochondrial DNA marker were homogeneous across all treatments and unaffected by experimental manipulations. These results suggest that the differential survivorship of Mpi genotypes in the field and laboratory results from a differential ability to process mannose-6-phosphate through glycolysis. The widespread polymorphism at Mpi observed in marine taxa may reflect the interaction between dietary composition and environmental heterogeneity in intertidal habitats.
/*
 * Copyright (c) 2004-present, Facebook, Inc.
 * All rights reserved.
 *
 * This source code is licensed under the BSD-style license found in the
 * LICENSE file in the root directory of this source tree. An additional grant
 * of patent rights can be found in the PATENTS file in the same directory.
 *
 */
#include "fboss/agent/hw/sai/switch/gen-cpp2/SaiCtrlAsyncClient.h"

#include <folly/Exception.h>
#include <folly/FileUtil.h>
#include <folly/SocketAddress.h>
#include <folly/init/Init.h>
#include <folly/io/async/EventBase.h>
#include <folly/io/async/EventBaseManager.h>
#include <thrift/lib/cpp/async/TAsyncSocket.h>
#include <thrift/lib/cpp2/async/HeaderClientChannel.h>
#include <thrift/lib/cpp2/async/RocketClientChannel.h>

#include <array>
#include <iostream>
#include <thread>

namespace {

std::unique_ptr<facebook::fboss::SaiCtrlAsyncClient> getStreamingClient(
    folly::EventBase* evb,
    const folly::IPAddress& ip) {
  folly::SocketAddress addr{ip, 5909};
  return std::make_unique<facebook::fboss::SaiCtrlAsyncClient>(
      apache::thrift::RocketClientChannel::newChannel(
          apache::thrift::async::TAsyncSocket::UniquePtr(
              new apache::thrift::async::TAsyncSocket(evb, addr))));
}

void subscribeToDiagShell(folly::EventBase* evb, const folly::IPAddress& ip) {
  auto client = getStreamingClient(evb, ip);
  auto responseAndStream = client->sync_startDiagShell();
  folly::writeFull(
      STDOUT_FILENO,
      responseAndStream.response.c_str(),
      responseAndStream.response.size());
  auto streamFuture =
      std::move(responseAndStream.stream)
          .subscribeExTry(
              evb,
              [evb](auto&& t) {
                if (t.hasValue()) {
                  const auto& shellOut = t.value();
                  folly::writeFull(
                      STDOUT_FILENO, shellOut.c_str(), shellOut.size());
                  // Typically, neither the error, nor completed events will
                  // occur, we expect most streams to end when the client is
                  // terminated by the user.
                } else if (t.hasException()) {
                  auto msg = folly::exceptionStr(std::move(t.exception()));
                  // N.B., can't use folly logging, because it messes with the
                  // terminal mode such that backspace, ctrl+D, etc.. don't
                  // seem to work anymore.
                  std::cout << "error in stream: " << msg << "\n";
                  evb->terminateLoopSoon();
                } else {
                  std::cout << "stream completed!\n";
                  evb->terminateLoopSoon();
                }
              })
          .futureJoin();
  evb->loop();
}

void handleStdin(folly::EventBase* evb, const folly::IPAddress& ip) {
  auto client = getStreamingClient(evb, ip);
  ssize_t nread;
  constexpr ssize_t bufSize = 512;
  std::array<char, bufSize> buf;
  while (true) {
    if ((nread = ::read(STDIN_FILENO, buf.data(), bufSize)) < 0) {
      folly::throwSystemError("failed to read from stdin");
    } else if (nread == 0) {
      break;
    } else {
      std::string input(buf.data(), nread);
      try {
        // TODO: fill in ClientInformation, or get rid of it in the API
        facebook::fboss::ClientInformation ci;
        client->sync_produceDiagShellInput(input, ci);
      } catch (const std::exception& e) {
        std::cout << "cli caught server exception " << e.what() << "\n";
      }
    }
  }
}

} // namespace

/*
 * Connect to the FBOSS agent SAI streaming thrift service:
 *
 * start a diag shell session to get a thrift stream; handle new stream data
 * by printing it.
 *
 * start a thread reading standard input and sending it to the FBOSS agent.
 */
int main(int argc, char* argv[]) {
  folly::init(&argc, &argv, true);
  folly::EventBase streamEvb;
  folly::EventBase stdinEvb;
  std::thread streamT([&streamEvb]() {
    subscribeToDiagShell(&streamEvb, folly::IPAddress{"::1"});
  });
  std::thread readStdinT(
      [&stdinEvb]() { handleStdin(&stdinEvb, folly::IPAddress{"::1"}); });
  readStdinT.join();
  streamEvb.terminateLoopSoon();
  streamT.join();
}
The hidden theology of Adam Smith

This paper contests recent readings of Adam Smith's invisible hand as an essentially secular device. It is argued that Smith's social and economic philosophy is inherently theological and that his entire model of social order is logically dependent on the notion of God's action in nature. It will be shown that, far from taking a purely secular, materialist or evolutionist approach, Smith works from the argument from design to construct a model that is teleological and securely located in the chain-of-being tradition. His focus upon happiness as the Final Cause of nature renders improbable any claims for proto-evolutionism in his work, while his arguments about the deliberate endowment of defects in the human frame make no sense without the supposition of design and purpose in nature.
Upregulation of IL-13 concentration in vivo by the IL13 variant associated with bronchial asthma.

BACKGROUND A substantial body of evidence supports the pivotal role of IL-13 in the pathogenesis of bronchial asthma. We recently found that a variant of the IL13 gene (Arg110Gln) is genetically associated with bronchial asthma, which is concordant with animal experiments using IL-13 in the development of asthma.

OBJECTIVE To address whether the Gln110 variant of IL13 influences IL-13 function, contributing to the pathogenesis of bronchial asthma, we studied the functional properties of the variant.

METHODS We generated two recombinant IL-13 proteins, carrying either arginine or glutamine at position 110, and analyzed their binding affinities for the IL-13 receptors, as well as the stability of the proteins. We further compared the relationship between the genotype and serum levels of IL-13.

RESULTS The variant showed a lower affinity for the IL-13 receptor alpha2 chain, a decoy receptor, causing less clearance. The variant also demonstrated an enhanced stability in both human and mouse plasma. We further identified that asthmatic patients homozygous for the Gln110 variant have higher serum levels of IL-13 than those without the variant.

CONCLUSION These results suggest that the variant might act as a functional genetic factor of bronchial asthma with a unique mechanism to upregulate local and systemic IL-13 concentrations in vivo.
/*
    Copyright (C) 2012 Modelon AB

    This program is free software: you can redistribute it and/or modify
    it under the terms of the BSD style license.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
    See the FMILIB_License.txt file for more details.

    You should have received a copy of the FMILIB_License.txt file
    along with this program. If not, contact Modelon AB <http://www.modelon.com>.
*/

#include <assert.h>

#include <FMI2/fmi2_capi.h>
#include <FMI2/fmi2_capi_impl.h>

fmi2_status_t fmi2_capi_enter_event_mode(fmi2_capi_t* fmu)
{
    assert(fmu);
    assert(fmu->c);
    jm_log_verbose(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2EnterEventMode");
    return fmu->fmi2EnterEventMode(fmu->c);
}

fmi2_status_t fmi2_capi_new_discrete_states(fmi2_capi_t* fmu, fmi2_event_info_t* eventInfo)
{
    assert(fmu);
    assert(fmu->c);
    jm_log_verbose(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2NewDiscreteStates");
    return fmu->fmi2NewDiscreteStates(fmu->c, eventInfo);
}

fmi2_status_t fmi2_capi_enter_continuous_time_mode(fmi2_capi_t* fmu)
{
    assert(fmu);
    assert(fmu->c);
    jm_log_verbose(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2EnterContinuousTimeMode");
    return fmu->fmi2EnterContinuousTimeMode(fmu->c);
}

fmi2_status_t fmi2_capi_set_time(fmi2_capi_t* fmu, fmi2_real_t time)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2SetTime");
    return fmu->fmi2SetTime(fmu->c, time);
}

fmi2_status_t fmi2_capi_set_continuous_states(fmi2_capi_t* fmu, const fmi2_real_t x[], size_t nx)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2SetContinuousStates");
    return fmu->fmi2SetContinuousStates(fmu->c, x, nx);
}

fmi2_status_t fmi2_capi_completed_integrator_step(fmi2_capi_t* fmu,
    fmi2_boolean_t noSetFMUStatePriorToCurrentPoint,
    fmi2_boolean_t* enterEventMode, fmi2_boolean_t* terminateSimulation)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2CompletedIntegratorStep");
    return fmu->fmi2CompletedIntegratorStep(fmu->c, noSetFMUStatePriorToCurrentPoint,
                                            enterEventMode, terminateSimulation);
}

fmi2_status_t fmi2_capi_get_derivatives(fmi2_capi_t* fmu, fmi2_real_t derivatives[], size_t nx)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2GetDerivatives");
    return fmu->fmi2GetDerivatives(fmu->c, derivatives, nx);
}

fmi2_status_t fmi2_capi_get_event_indicators(fmi2_capi_t* fmu, fmi2_real_t eventIndicators[], size_t ni)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2GetEventIndicators");
    return fmu->fmi2GetEventIndicators(fmu->c, eventIndicators, ni);
}

fmi2_status_t fmi2_capi_get_continuous_states(fmi2_capi_t* fmu, fmi2_real_t states[], size_t nx)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2GetContinuousStates");
    return fmu->fmi2GetContinuousStates(fmu->c, states, nx);
}

fmi2_status_t fmi2_capi_get_nominals_of_continuous_states(fmi2_capi_t* fmu, fmi2_real_t x_nominal[], size_t nx)
{
    assert(fmu);
    jm_log_debug(fmu->callbacks, FMI_CAPI_MODULE_NAME, "Calling fmi2GetNominalsOfContinuousStates");
    return fmu->fmi2GetNominalsOfContinuousStates(fmu->c, x_nominal, nx);
}
import {
	ASTBase,
	ASTListConstructorExpression,
	ASTListValue
} from 'greybel-core';
import { Expression } from '../types/expression';
import { Operation } from '../types/operation';
import CustomList from '../custom-types/list';
import { isCustomValue } from '../typer';
import { OperationContext } from '../context';

export class ExpressionSegment {
	values: any[];

	constructor(values: any[]) {
		this.values = values;
	}
}

export default class ListExpression extends Expression {
	expr: ExpressionSegment;

	constructor(ast: ASTListConstructorExpression) {
		super();
		const me = this;
		me.ast = ast;
		me.expr = null;
	}

	async prepare(visit: Function): Promise<ListExpression> {
		const me = this;
		const node = me.ast;

		me.expr = new ExpressionSegment(
			await Promise.all(node.fields.map((item: ASTListValue) => {
				return visit(item.value);
			}))
		);

		return me;
	}

	get(operationContext: OperationContext, parentExpr: any): Promise<CustomList> {
		const me = this;
		const evaluate = async function(node: any[]): Promise<CustomList> {
			const traverselPath = [].concat(node);
			const list = [];
			let current;

			while (current = traverselPath.shift()) {
				if (isCustomValue(current)) {
					list.push(current);
				} else if (current instanceof Expression) {
					list.push(await current.get(operationContext));
				} else if (current.value instanceof Operation) {
					list.push(await current.value.get(operationContext));
				} else {
					operationContext.debugger.raise('Unexpected value in list.', me, current.value);
				}
			}

			return new CustomList(list);
		};

		operationContext.debugger.debug('Line', me.ast.start.line, 'ListExpression', 'get', 'expr', me.expr);

		return evaluate(me.expr.values);
	}
}
export * from './auth.service';
import { AuthService } from './auth.service';
export * from './developer.service';
import { DeveloperService } from './developer.service';
export * from './items.service';
import { ItemsService } from './items.service';
export * from './language.service';
import { LanguageService } from './language.service';
export * from './logs.service';
import { LogsService } from './logs.service';
export * from './module.service';
import { ModuleService } from './module.service';
export * from './sensor.service';
import { SensorService } from './sensor.service';
export * from './streaming.service';
import { StreamingService } from './streaming.service';
export * from './users.service';
import { UsersService } from './users.service';

export const APIS = [
    AuthService,
    DeveloperService,
    ItemsService,
    LanguageService,
    LogsService,
    ModuleService,
    SensorService,
    StreamingService,
    UsersService,
];
Serious renal and urological complications in fast-track primary total hip and knee arthroplasty: a detailed observational cohort study.

BACKGROUND Overall medical complications have been reduced after fast-track total hip (THA) and knee arthroplasty (TKA), but data on specific renal and urological (RU) complications are limited.

METHODS To describe the incidence and consequences of serious RU complications resulting in a length of stay > 4 days or 30-day readmission after fast-track THA and TKA, we conducted a detailed observational study based upon prospectively collected pre-operative data and a complete 30-day follow-up on complications and re-admissions in an unselected cohort of 8,804 consecutive fast-track THAs and TKAs. Our main outcomes were the incidence, types and consequences of RU complications.

RESULTS Of 8,804 procedures, 54 (0.61%) developed serious RU complications, resulting in 38 (0.43%) prolonged hospitalisations and 17 (0.19%) readmissions. Acute kidney injury (AKI), defined as an increase in serum creatinine by ≥ 0.3 mg/dl or ≥ 1.5 times baseline, accounted for 43 complications (0.49%) and was most frequently associated with postoperative hypotension. Of the AKI patients, 25 (58.1%) had a preoperative estimated glomerular filtration rate < 60 ml/min/1.73 m², and 16 of these had received an NSAID postoperatively. Seven complications (0.08%) were urological, mainly haematuria after bladder catheterisation, whereas 5 (0.06%) were urosepsis/pyelonephritis.

CONCLUSION The overall incidence of serious RU complications after fast-track THA and TKA was 0.61%. AKI occurred in 0.49% and was most often due to pre-existing kidney disease and postoperative hypotension, calling for an increased focus on perioperative fluid management and optimisation of the perioperative care of patients with pre-existing kidney disease.
/**
 * Function formatting a number or character to a hex value.
 */
public class HexFieldPart implements Part {

    private String fieldName;

    /**
     * Creates a new hex field part.
     * @param fieldName the field name
     */
    public HexFieldPart(String fieldName) {
        this.fieldName = fieldName;
    }

    /** {@inheritDoc} */
    public boolean isGenerated(Map params) {
        Object obj = params.get(fieldName);
        return obj != null;
    }

    /** {@inheritDoc} */
    public void write(StringBuffer sb, Map params) {
        if (!params.containsKey(fieldName)) {
            throw new IllegalArgumentException(
                    "Message pattern contains unsupported field name: " + fieldName);
        }
        Object obj = params.get(fieldName);
        if (obj instanceof Character) {
            sb.append(Integer.toHexString(((Character)obj).charValue()));
        } else if (obj instanceof Number) {
            sb.append(Integer.toHexString(((Number)obj).intValue()));
        } else {
            throw new IllegalArgumentException("Incompatible value for hex field part: "
                    + obj.getClass().getName());
        }
    }

    /** {@inheritDoc} */
    public String toString() {
        return "{" + this.fieldName + ",hex}";
    }

    /** Factory for {@link HexFieldPart}. */
    public static class Factory implements PartFactory {

        /** {@inheritDoc} */
        public Part newPart(String fieldName, String values) {
            return new HexFieldPart(fieldName);
        }

        /** {@inheritDoc} */
        public String getFormat() {
            return "hex";
        }
    }
}
FreeNAS 9.3 Released

Here's an early Christmas present for you all: FreeNAS 9.3! This FreeNAS update is a significant evolutionary step from previous FreeNAS releases, featuring a simplified and reorganized Web User Interface, support for Microsoft ODX and Windows 2012 clustering, better VMware integration, including VAAI support, a new and more secure update system with roll-back functionality, and hundreds of other technology enhancements. We're quite proud of it and excited to make it publicly available. You can get it here, and the list of changes is here. We encourage all existing 9.2.x users and 9.3 beta testers to upgrade.

Last month saw the release of FreeNAS 9.3-BETA. Thousands of users downloaded the beta. Here's a quick glance at the improvements made to FreeNAS 9.3:

Jordan Hubbard took some time to make a State of the Union video addressing the changes in 9.3 and discussing the plans for 10.x. If you haven't already, you can see it here:

Additionally, you can watch this video by Linda Kateley, FreeNAS instructor, for an in-depth overview of the changes:

I enjoyed seeing many of you at MeetBSD in San Jose at the beginning of November, and I hope everyone enjoys this release of FreeNAS!

Brett Davis
iXsystems Executive Vice President
The official release date for Tekken 7 has been revealed after Amazon tried to steal the show by revealing it on their own! The PlayStation Blog has confirmed a June 2nd release date for the game.

The PS4 version will feature exclusive content, like a 'Jukebox Mode' which will allow you to create playlists composed of songs from previous games in the series and listen to them while playing the game. Additionally, the PS4's exclusive content will include costumes from Tekken 4 and Tekken 2 for Jin, Xiaoyu, and King.

Prior to the official reveal, Amazon listings for the Collector's Edition of Tekken 7 (on both PlayStation 4 and Xbox One) popped up and jumped the gun on the release date reveal. In addition to revealing the game's release date, the listings also detailed the contents of the $150 Collector's Edition. The June 2nd, 2017 date is a fair push-back from the 'Early 2017' window that the trailer had suggested.

The listings also revealed that the Collector's Edition will feature the following items:

- 12" x 18" Kazuya & Heihachi Figure
- Tekken 7 Original Game Soundtrack
- Steel Book
- Collector's Box
- Tekken 7 Game Software

It looks like the content of the Tekken 7 Collector's Edition is pretty similar to the Collector's Edition of Tekken Tag Tournament 2. The official product description from the Amazon store page can be found below:

Raise your fists and get ready for the ultimate battle on the next generation of home consoles. Powered by the Unreal Engine 4, the storied fighting franchise returns for another round in Tekken 7. With the faithful 3D battle system and gameplay intact, Tekken 7 takes the franchise to the next level with photo-realistic graphics and new and innovative features and fighting mechanics. Tekken 7 resurrects the attitude, competitiveness and showmanship rooted in its arcade DNA to provide the ultimate fighting game experience.

Tekken 7 will feature new gameplay mechanics like the Rage System, which comes into play when your life bar begins flashing red. Entering Rage mode will give your character more damaging hits and two new moves: the Rage Art and Rage Drive.

[NeoGaf]
package de.jano1.sponge.regions_api;

import de.jano1.sponge.regions_api.subfaces.PluginRelatable;
import de.jano1.sponge.regions_api.subfaces.RegionParentable;
import org.spongepowered.api.data.DataSerializable;
import org.spongepowered.api.world.Location;

import java.io.Serializable;

/**
 * Created by Jano1 on 01.07.2017.
 */
public interface Region extends PluginRelatable, RegionParentable {

    /**
     * Sets a new shape for this region.
     * @param shape The new shape
     */
    public void setShape(RegionShape shape);

    /**
     * Gets the region_id of this region.
     * @return The region_id as string
     */
    public String getID();

    /**
     * Gets the shape of this region.
     * @return The shape of this region
     */
    public RegionShape getShape();

    /**
     * Gets the flags belonging to this region.
     * @return An array of flags
     */
    public RegionFlag[] getFlags();

    /**
     * Tests if this region contains any flags.
     * @return True if there is at least one flag, false otherwise
     */
    public boolean hasFlags();

    /**
     * Tests if this region contains any flags that are related to
     * a specific plugin.
     * @param plugin_id The plugin id to test
     * @return True if related, false otherwise
     */
    public boolean hasFlagsRelatedTo(String plugin_id);

    /**
     * Same as {@link #getFlags()}, but the returned flags all
     * belong to the given plugin id.
     * @param plugin_id The plugin you want the flags for
     * @return An array of flags
     */
    public RegionFlag[] getFlagsRelatedTo(String plugin_id);

    /**
     * Sets a flag for this region.
     * @param flag The flag to set
     */
    public void setFlag(RegionFlag flag);

    /**
     * Sets an array of flags for this region at once.
     * @param flags The array which should be set
     */
    public void setFlags(RegionFlag[] flags);

    /**
     * Tests if this region contains a specific location.
     * @param location The location to test
     * @return True if the location is contained, false otherwise
     */
    public boolean containsLocation(Location location);
}
// src/models/submission.ts
import mongoose from 'mongoose';

const Submissions = mongoose.model('submissions', new mongoose.Schema({
    name: {
        type: String,
        required: true,
    },
    score: {
        type: Number,
        required: true,
    },
}));

export default Submissions;
#!/usr/bin/env python
# coding: utf-8

from __future__ import print_function
from __future__ import absolute_import

import os
import sys
import tct

from os.path import exists as ospe

params = tct.readjson(sys.argv[1])
binabspath = sys.argv[2]
facts = tct.readjson(params['factsfile'])
milestones = tct.readjson(params['milestonesfile'])
reason = ''
resultfile = params['resultfile']
result = tct.readjson(resultfile)
loglist = result['loglist'] = result.get('loglist', [])
toolname = params['toolname']
toolname_pure = params['toolname_pure']
workdir = params['workdir']
exitcode = CONTINUE = 0


# ==================================================
# Make a copy of milestones for later inspection?
# --------------------------------------------------

if 0 or milestones.get('debug_always_make_milestones_snapshot'):
    tct.make_snapshot_of_milestones(params['milestonesfile'], sys.argv[1])


# ==================================================
# Helper functions
# --------------------------------------------------

def lookup(D, *keys, **kwdargs):
    result = tct.deepget(D, *keys, **kwdargs)
    loglist.append((keys, result))
    return result


# ==================================================
# define
# --------------------------------------------------

remove_static_folder_from_html_done = None
xeq_name_cnt = 0


# ==================================================
# Check params
# --------------------------------------------------

if exitcode == CONTINUE:
    loglist.append('CHECK PARAMS')

    build_html_folder = lookup(milestones, 'build_html_folder', default=None)
    build_singlehtml_folder = lookup(milestones, 'build_singlehtml_folder',
                                     default=None)
    replace_static_in_html_done = lookup(milestones,
                                         'replace_static_in_html_done',
                                         default=None)
    theme_module_path = lookup(milestones, 'theme_module_path')

    if not (1
            and (build_html_folder or build_singlehtml_folder)
            and replace_static_in_html_done
            and theme_module_path):
        CONTINUE = -2
        reason = 'Bad PARAMS or nothing to do'

if exitcode == CONTINUE:
    loglist.append('PARAMS are ok')
else:
    loglist.append(reason)


# ==================================================
# work
# --------------------------------------------------

if exitcode == CONTINUE:
    statics_to_keep = milestones.get('statics_to_keep', [])
    for build_folder in [build_html_folder, build_singlehtml_folder]:
        if not build_folder:
            continue
        startfolder = '_static'
        fixed_part_length = len(build_folder)
        fpath = os.path.join(build_folder, startfolder)
        if os.path.exists(fpath):
            for top, dirs, files in os.walk(fpath, topdown=False):
                for file in files:
                    topfile = top + '/' + file
                    relfile = topfile[fixed_part_length + 1:]
                    themefile = theme_module_path + '/' + relfile[1:]
                    if not (relfile in statics_to_keep) and ospe(themefile):
                        os.remove(topfile)
                if not os.listdir(top):
                    os.rmdir(top)
            loglist.append('%s, %s' % ('remove', fpath))

    remove_static_folder_from_html_done = 1


# ==================================================
# Set MILESTONE
# --------------------------------------------------

if remove_static_folder_from_html_done:
    result['MILESTONES'].append({
        'remove_static_folder_from_html_done':
            remove_static_folder_from_html_done})


# ==================================================
# save result
# --------------------------------------------------

tct.save_the_result(result, resultfile, params, facts, milestones,
                    exitcode, CONTINUE, reason)


# ==================================================
# Return with proper exitcode
# --------------------------------------------------

sys.exit(exitcode)
This invention relates to a process for producing stainless steel having a bright surface. More particularly, it relates to producing stainless steel having a bright annealed-like surface without the need for annealing in a controlled atmosphere.

Austenitic and ferritic stainless steel strip is conventionally produced by converting an ingot of the steel to a slab which, after conditioning, is rolled on a continuous hot-strip mill to produce a hot-rolled band of intermediate gauge. The hot-rolled band is then cold rolled to final gauge. After cold rolling to final gauge, the surface of the stainless steel strip is bright and glossy. For various final product applications, it is necessary to preserve this surface finish. After cold rolling, however, it is also necessary to anneal the cold-rolled strip to recrystallize the cold-rolled structure and obtain a stress-free product. For austenitic stainless steel, this generally requires annealing temperatures on the order of 1900° to 2000°F, and for ferritic stainless steel, temperatures on the order of 1500° to 1600°F. To preserve the cold-rolled surface finish during such annealing, it is performed in a controlled atmosphere furnace wherein both the atmosphere and dew point are regulated. Specifically, the furnace atmosphere may be hydrogen, hydrogen and nitrogen, or cracked ammonia (75% hydrogen + 25% nitrogen). Controlled atmosphere furnaces of this type are termed "bright annealing furnaces". The strip is passed continuously through the furnace during the annealing operation. Bright annealing cannot be performed in a batch-type furnace wherein the strip is in coil form, because the convolutions of the coil will weld together during annealing and impair the surface finish of the strip.

What is needed is a method to produce bright-surface stainless steel of comparable surface quality to bright annealed stainless steel produced by conventional practices using a controlled atmosphere furnace. It is desirable to do so without the capital cost of extensive new equipment, but through modification of existing facilities and processes.
THE OTHER AS A SACRED PRINCIPLE OF NATIONAL MEANINGS

One of the most characteristic problems of our time is the crisis of collective and personal identity, the development of which results in the loss of the sacred landmarks a person needs in order to comprehend his own reality. In this regard, it is necessary to form a theory of identity that asserts the balance of personal and collective meanings. Since national identity is the leading one in the modern world, this paper proposes a hypothesis of the formation of the integration principle of the nation, which is understood as the sacred perception of the Other at the personal and collective level. The Western and Eastern models of nation-building are considered. In the first case, the destruction of the system of collective meanings is indicated, due to the excessive sacralization of the personality. In the second case, personal meanings are desacralized through the dissolution of the individual in the processes of collective sense formation. At the same time, in each case, interpersonal alienation and the weakening of social ties are recorded. As the main result of this work, the formation of an alternative model of the nation is proposed, in which the Other is perceived as a sacred meaning-forming landmark. The results of the article can be used as a method in the study of the nation and national identity.

Introduction

One of the most pressing topics of modern social philosophy is the problem of significant sociocultural changes, expressed in the form of the accelerating intensity of intercultural communications and the global unification of worldview positions and lifestyles. As a result, a general crisis state associated with the loss of traditional collective identities is recorded. In view of the impossibility of a complete reduction of the social to the individual and of the individual to the social, "a person's identity should be considered as a result of the interaction of the sphere of the individual and the social" (Dryaeva & Kanaev, 2020, p. 621). In this regard, modern science speaks of a crisis state of the person as a whole, since one research direction holds that human nature "... is a product of culture or social conditions, and the other asserts that there is no such thing as a universal human nature" (Gorodovich, 2020, p. 61). Thus, it is legitimate to talk about a crisis in a person's perception of his own reality. The source of reality, according to Eliade, is the sacred, towards the connection with which the innermost desire of a person is directed, "equivalent to his desire to find himself in objective reality <...>, to live in reality, in a real, and not in an illusory world" (p. 26). The sacred perception of identity consists of a balance of personal and collective meanings. The loss of such a balance in the picture of the world of a modern person arises in connection with the desacralization of the perception of reality. In this regard, it is necessary to form a theory of identity whose integration principle is the sacred unity of personal and collective meanings.

Problem Statement

According to the theory whose provisions are used in this work, a person, in striving for the sacred, needs a system of orientational navigation, through which the leading directions of the development of life meanings are built. Thus, Stegmaier argues that "Orientation as food and breathing is an elementary, necessary and unceasing vital need" (p. 2).
According to the domestic researcher Smirnov, "Otherwise, a person cannot act and live, except to constantly orient himself. The latter is ontologically rooted" (p. 187). In connection with the adoption of these provisions, the main task of the article is formulated: to identify the sacred landmarks that determine the development and semantic unity of the process of personal and collective identification.

Research Questions

The leading hypothesis of this work is the idea of forming a single semantic development of national identification, since the political, cultural and religious meanings of the existence of a European person of the last two centuries have realized the quality of national ones. At the same time, under conditions of intensive desacralization of the main meanings of life, it is the nation that is revealed as a religious community, forming around a sacred core, and European "... nationalisms derive concepts, images and references... from the Christian religious tradition" (Zygmont, 2019, p. 163). In addition, the fixation among the populations of the European Union, the USA and the Russian Federation of "... the phenomenon of interpersonal political and psychological tension <...> of individual hostility and interpersonal sensitivity" (Dembitsky & Burova, 2020, p. 8) suggests the formation of a high level of mutual understanding between members of the nation as an indispensable condition for the process of nation-formation. In this regard, the integration principle of a modern nation must be considered the formation of the perception by members of the nation of each other as a significant Other, understood as a sacred landmark in the continuous process of personal and collectivist identification. The provisions of the stated hypothesis find confirmation in the ideas of the philosophy of dialogue. Thus, according to Buber, the upbringing of the perception of a person as a significant Other is capable of forming a special space "Between", in which "I and You freely stand in interaction with each other" (p. 44). Also, Bakhtin, describing the ways of relationship with the Other, used the concepts of "getting used to" and "feeling", necessary to convey a state in which "The values of being of a qualitatively defined personality are inherent only to another" (p. 128). Acceptance of the provisions of the approved hypothesis is possible both at the interpersonal level, which contributes to consolidation within the community, and at the international level, where it can be a way to prevent the emergence of Nazism and xenophobia.

Purpose of the Study

In connection with the designation of possible spheres of theoretical and practical application of the provisions of this hypothesis, it becomes possible to formulate the purpose of this article: to describe the altruistic orientation to the Other as an integration principle of the nation, presupposing the sacred unity of national meanings. The purpose of this work is to substantiate the need to form the perception of the Other as a sacred principle of the unity of national meanings, using the methods of social constructivism, the phenomenological method, and the methods of polyparadigmatic reflection and intersubjectivism.
Research Methods In this work, national identity is presented as a unity of collective and personal meanings. The interdisciplinarity of this approach implies the use of a complex methodology that includes various methods. Thus, the understanding of nation-formation as an identification process open to artificial reconstruction presupposes the use of the method of social constructivism. To consider the liberal and collectivist models of nation-building, the method of comparative analysis is used. The method of phenomenology is used to identify the sacred basis of identity, while understanding identity as a balance of personal and collective meanings requires the method of polyparadigmatic reflection. In addition, problems falling within the methodological field of theology and philosophy were addressed using the method of correlation, since, according to Tillich, who developed this method, "philosophy formulates those questions that are implicitly inherent in human existence, and theology formulates those answers that are implicitly inherent in divine self-manifestation under the influence of the questions inherent in human existence" (p. 64). Finally, the method of intersubjectivism is used to present the altruistic perception of the Other as the sacred integration principle of modern nation-formation proposed in this work. Findings As a result of the study, the need for active national reform is substantiated. Since national identity is a deep combination of personal and collective sacred meanings, the awareness of belonging to a nation is the guarantor of the formation of personal and collective identity. The desire of a person to acquire "the fullness of being" (Eliade, 1994, p. 16), identified in anthropological and phenomenological studies, presupposes building a personal life strategy depending on the chosen direction towards the sacred center. The system of sacred meanings of national identification is understood as navigation that guides the individual and the collective in acquiring sacred completeness. However, the directions of sacred navigation differ depending on the formulation of the integration principle of the nation. Thus, in the Western model of nation-formation, a personal strategy is built which presupposes the discovery of a sacred source at the center of the personality. This strategy, sacralizing the personality, offers an understanding of the personality as an independent individual whose self-identification is perceived as increasingly independent of collective identity. It is also noted that "Contemporary literature on identity prefers "psycho" over "social", which limits our understanding of the role of identity processes in society in a broader sense" (Rogers, 2018, p. 284). This tendency manifests itself in the profanation of collective meanings, which leads to a rise in hedonistic values; and the desire, rooted in individualism, to abandon the sacred boundaries of gender, race, ethnicity, etc., generates a strong feminist trend in the Euro-American political environment, which can lead to the desacralization of the European's cognitive picture of the world, since "feminism entails a rejection of deductive "enlightenment thinking" because it is based on a masculine cognitive style" (Olsson, 2018, p. 410).
The parallel strategy of nation-formation presupposes an appeal to the sacred localized outside the individual, but closes the navigational boundaries on the sacralized community. Thus, the nation itself is perceived as a sacred landmark. The obvious disadvantage of this strategy is the desacralization of personal meanings and the partial dissolution of individuality in the development of collective meanings. The implementation of this project is typical of Eastern nationalisms. In particular, the formation of a unified Chinese nation is based on the idea developed by Sun Yat-sen, who received the status of "Father of the Nation". According to this idea, national unity should bridge regional and religious divisions. The results of studies carried out in educational institutions in China in 2020 show that there really is a "positive relationship between ethnic identity and Chinese national identity for adolescents of different nationalities" (p. 1). The development of this strategy, with its concentration on collective sense formation and parallel profanation of personal meanings, becomes fertile ground for the formation of authoritarian state regimes. Thus, the social credit system adopted in China, through which the state gains the ability to regulate a person's private life, indicates the profanation of the personal, intimate sphere. The Japanese model of Society 5.0, in which human life is carried out with the help of artificial intelligence, points to processes of desacralization of human intelligence and its gradual replacement with artificial intelligence, which is "nothing more than a rule-based imitation of our life form, which will lure only those of us who buy into its illogical premises" (Baer, 2018, p. 17). As a third example, we can point to the multitude of pseudo-national entities that adopt the name of the nation in view of the social and political prestige of this category. These models of nation-building are not considered a solution to the identification crisis, since the desacralization of collective meanings in the case of the Western model and the profanation of personal sense formation in the case of the Eastern model do not represent a sacred semantic balance. In this regard, an alternative theory of sacral orientation is needed, one combining the development of personal and collective meanings, which is realized in national interpersonal unity. The idea developed in the Western philosophical tradition, in the Nietzsche-Foucault-Deleuze continuity, is taken as the basis for formulating an alternative principle of nation-formation. Thus, Deleuze, in a work devoted to the philosophy of Foucault, writes: "Forces in a person constitute form only when they enter into relations with external forces" (p. 139). Deleuze then formulates a theory of the historical development of sacred orientation. According to Deleuze, the primary form taken as a source of sacred power is the Absolute, or the "God-form". This form is not conducive to personal and collective sense-making, locking the personality into a direct connection with the Absolute. The next form, according to Deleuze, reveals the source of the sacred in man himself: the "Man-form".
The implementation of this strategy contributes to Western nation-formation and, at a certain stage, achieves a balance of personal and collective meanings. However, immersion in the depths of the personality in search of a sacred source did not bring the expected results. On the contrary, the desacralization of the collective meanings of Western nations became the reason for the permeability of their borders. Thus, European researchers record "... that a parallel world is being built within their states, based on different norms, values and ways of life. And this world is capable of destroying the integrity of a nation from within, undermining its inherent value basis, leading to conflicts and violence" (Starodubrovskaya, 2021, p. 147). The third form, according to Deleuze, is virtual orientation: the "Superman-form". Indeed, modern researchers record the migration of human everyday life into cyberspace, which makes it possible to carry out identification practices there. Realizing everyday life on the net, individuals thus do not manifest themselves as passive recipients of nationalist discourses, but make an active contribution to them, posting photographs of themselves in folk costumes or during training in the gym. In this way, they can either strengthen or resist dominant narratives and participate in the formation of nations (p. 63). However, the withdrawal of private life into cyberspace becomes a cause of social atomization, owing to the lack of live communication between people. Thus, the third form of sacred orientation in the chain God - Man - Superman should be replaced by the form of the "Other person", understood not as one's own personality, as is typical of the Western national strategy, but as the personality of another person from the personal position, and as another nation from the collectivist position. The adoption of this strategy as the integration principle of nation-formation should contribute to reorienting sacred navigation away from an egoistic route that destroys identity and excludes personal and collective identification from the sphere of objective reality, towards a spiritual altruistic orientation, which forms the basis of the moral component of the spiritual abilities of the individual. Conclusion So, the basis of life sense-making is the striving for the sacred as being possessing ontological superiority. This aspiration rests on building sacred routes, which assumes the presence of sacred landmarks laid out in two spheres: the personal and the collective. Only a relationship of balance between these spheres provides the opportunity for personal self-identity and collective identity. Destabilizing processes in the spheres of culture, politics and sociality cause an identification crisis, which can be overcome only through the balanced development of personal and collective identification. The identification processes of our time are carried out mainly through nation-formation, during which a system of personal and collective sacred orientation is built. However, taking into account the difference in the routes of sacral orientation, two main strategies can be distinguished in modern nation-formation, which may tentatively be titled "Western" and "Eastern".
Western nation-formation builds a sacred route oriented towards the personality, in connection with which, at a certain stage, processes of desacralization of collective meanings are triggered, which soon causes a general identification crisis. The Eastern strategy of nation-formation, on the contrary, reveals the source of the sacred in the nation itself, dissolving personal meanings in collective sense-making. Thus, the existing strategies are not a solution to the problem of the identity crisis, since they do not form the connection between the individual and the collective that is necessary for identity, in which I and They are transformed into We. The main conclusion of the study is the description of the need to form an alternative identification principle that builds identity on the basis of a deep connection between individuals within a collective and between communities in the global space. Such a principle is the perception of the Other as a sacred reference point in realizing the unity of the personal and collective meanings of national identification.
import { PhysicalItemData, TraitChatData } from '@item/data';
import { LocalizePF2e } from '@module/system/localize';
import { Rarity } from '@module/data';
import { ItemPF2e } from '@item/index';
import type { ContainerPF2e } from '@item/index';
import { MystifiedTraits } from '@item/data/values';
import { getUnidentifiedPlaceholderImage } from '../identification';
import { IdentificationStatus, MystifiedData, PhysicalItemTrait } from './data';
import { coinsToString, extractPriceFromItem } from '@item/treasure/helpers';

export abstract class PhysicalItemPF2e extends ItemPF2e {
    /** The cached container of this item, if in a container, or null */
    private _container: Embedded<ContainerPF2e> | null = null;

    get level(): number {
        return this.data.data.level.value;
    }

    get rarity(): Rarity {
        return this.data.data.traits.rarity.value;
    }

    get quantity(): number {
        return this.data.data.quantity.value ?? 1;
    }

    get isEquipped(): boolean {
        return this.data.isEquipped;
    }

    get price(): string {
        return this.data.data.price.value;
    }

    get identificationStatus(): IdentificationStatus {
        return this.data.data.identification.status;
    }

    get isIdentified(): boolean {
        return this.data.isIdentified;
    }

    get isAlchemical(): boolean {
        return this.data.isAlchemical;
    }

    get isMagical(): boolean {
        const traits: Set<string> = this.traits;
        const magicTraits = ['magical', 'arcane', 'primal', 'divine', 'occult'] as const;
        return magicTraits.some((trait) => traits.has(trait));
    }

    get isInvested(): boolean | null {
        const traits: Set<string> = this.traits;
        if (!traits.has('invested')) return null;
        return this.data.isEquipped && this.data.isIdentified && this.data.data.invested?.value === true;
    }

    get isCursed(): boolean {
        return this.data.isCursed;
    }

    get material() {
        const systemData = this.data.data;
        return systemData.preciousMaterial.value && systemData.preciousMaterialGrade.value
            ? {
                  type: systemData.preciousMaterial.value,
                  grade: systemData.preciousMaterialGrade.value,
              }
            : null;
    }

    get isInContainer(): boolean {
        return !!this.container;
    }

    /** Get this item's container, returning null if it is not in a container */
    get container(): Embedded<ContainerPF2e> | null {
        if (this.data.data.containerId.value === null) return (this._container = null);

        const container = this._container ?? this.actor?.items.get(this.data.data.containerId.value ?? '');
        if (container?.type === 'backpack') this._container = container as Embedded<ContainerPF2e>;

        return this._container;
    }

    override prepareBaseData(): void {
        super.prepareBaseData();
        const systemData = this.data.data;

        // null out empty-string values
        systemData.preciousMaterial.value ||= null;
        systemData.preciousMaterialGrade.value ||= null;
        systemData.containerId.value ||= null;

        // Normalize price string
        systemData.price.value = coinsToString(extractPriceFromItem(this.data, 1));

        this.data.isEquipped = systemData.equipped.value;
        this.data.isIdentified = systemData.identification.status === 'identified';

        const traits = this.traits;
        this.data.isAlchemical = traits.has('alchemical');
        this.data.isCursed = traits.has('cursed');

        // Magic and invested status is determined at the class-instance level since it can be updated later in data
        // preparation
        this.data.isMagical = this.isMagical;
        this.data.isInvested = this.isInvested;

        // Set the _container cache property to null if it no longer matches this item's container ID
        if (this._container?.id !== this.data.data.containerId.value) {
            this._container = null;
        }
    }

    /** Refresh certain derived properties in case of special data preparation from subclasses */
    override prepareDerivedData(): void {
        super.prepareDerivedData();
        this.data.isMagical = this.isMagical;
        this.data.isInvested = this.isInvested;

        const systemData = this.data.data;
        systemData.identification.identified = {
            name: this.name,
            img: this.img,
            data: {
                description: { value: this.description },
            },
        };

        // Update properties according to identification status
        const mystifiedData = this.getMystifiedData(this.identificationStatus);
        mergeObject(this.data, mystifiedData, { insertKeys: false, insertValues: false });

        // Fill gaps in unidentified data with defaults
        systemData.identification.unidentified = this.getMystifiedData('unidentified');
    }

    /** Can the provided item stack with this item? */
    isStackableWith(item: PhysicalItemPF2e): boolean {
        if (this.type !== item.type || this.name != item.name || this.isIdentified != item.isIdentified) return false;
        const thisData = this.toObject().data;
        const otherData = item.toObject().data;
        thisData.quantity.value = otherData.quantity.value;
        thisData.equipped.value = otherData.equipped.value;
        thisData.containerId.value = otherData.containerId.value;
        thisData.identification = otherData.identification;

        return JSON.stringify(thisData) === JSON.stringify(otherData);
    }

    /** Retrieve substitution data for an unidentified or misidentified item, generating defaults as necessary */
    getMystifiedData(status: IdentificationStatus, _options?: Record<string, boolean>): MystifiedData {
        const mystifiedData: MystifiedData = this.data.data.identification[status];

        const name = mystifiedData.name || this.generateUnidentifiedName();
        const img = mystifiedData.img || getUnidentifiedPlaceholderImage(this.data);

        const description =
            mystifiedData.data.description.value ||
            (() => {
                if (status === 'identified') return this.description;

                const formatString = LocalizePF2e.translations.PF2E.identification.UnidentifiedDescription;
                const itemType = this.generateUnidentifiedName({ typeOnly: true });
                const caseCorrect = (noun: string) =>
                    game.i18n.lang.toLowerCase() === 'de' ? noun : noun.toLowerCase();
                return game.i18n.format(formatString, { item: caseCorrect(itemType) });
            })();

        return {
            name,
            img,
            data: {
                description: {
                    value: description,
                },
            },
        };
    }

    override getChatData(): Record<string, unknown> {
        return {
            rarity: CONFIG.PF2E.rarityTraits[this.rarity],
            description: { value: this.description },
        };
    }

    async setIdentificationStatus(status: IdentificationStatus): Promise<void> {
        if (this.identificationStatus === status) return;

        await this.update({
            'data.identification.status': status,
            'data.identification.unidentified': this.getMystifiedData('unidentified'),
        });
    }

    generateUnidentifiedName({ typeOnly = false }: { typeOnly?: boolean } = { typeOnly: false }): string {
        const itemType = game.i18n.localize(`ITEM.Type${this.data.type.capitalize()}`);
        if (typeOnly) return itemType;

        const formatString = LocalizePF2e.translations.PF2E.identification.UnidentifiedItem;
        return game.i18n.format(formatString, { item: itemType });
    }

    /** Include mystification-related rendering instructions for views that will display this data. */
    protected override traitChatData(dictionary: Record<string, string>): TraitChatData[] {
        const traitData = super.traitChatData(dictionary);
        for (const trait of traitData) {
            trait.mystified = !this.isIdentified && MystifiedTraits.has(trait.value);
            trait.excluded = trait.mystified && !game.user.isGM;
            if (trait.excluded) {
                delete trait.description;
            } else if (trait.mystified) {
                const gmNote = LocalizePF2e.translations.PF2E.identification.TraitGMNote;
                trait.description = trait.description
                    ? `${gmNote}\n\n${game.i18n.localize(trait.description)}`
                    : gmNote;
            }
        }

        return traitData;
    }
}

export interface PhysicalItemPF2e {
    readonly data: PhysicalItemData;

    get traits(): Set<PhysicalItemTrait>;
}
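For orientation, a brief usage sketch follows. It assumes a calling context with an `actor` owning the item and an `itemId`; neither appears in the file above, and only the method names come from the class itself.

// Hypothetical usage: mystify a physical item so players see placeholder data.
// `actor` and `itemId` are assumed to exist in the calling context.
const item = actor.items.get(itemId);
if (item instanceof PhysicalItemPF2e) {
    await item.setIdentificationStatus('unidentified'); // regenerates unidentified name/img/description
}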
// src/MappingSource.h
#pragma once

#include "ofMain.h"
#include "FboSource.h"
#include "Constants.h"
#include "Settings.h"
#include "Fonts.h"
#include "Laureate.h"
#include "LaureateDisplay.h"
#include "StrobeElements.h"

class MappingSource : public ofx::piMapper::FboSource {
public:
    MappingSource();
    void setup();
    void update();
    void draw();
    void drawDebugGrid();
    void setStrobeElements(shared_ptr<bmbf::nobel::StrobeElements> sr);

    shared_ptr<bmbf::nobel::Settings> settings;
    shared_ptr<bmbf::nobel::Fonts> fonts;
    shared_ptr<bmbf::nobel::LaureateDisplay> laureateDisplay;
    int debugGridStep;
};
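Since only the header appears here, the following is a minimal sketch of what the corresponding MappingSource.cpp might look like, assuming ofxPiMapper's FboSource contract (a `name` member and an `allocate(w, h)` call in setup(), with draw() rendering into the FBO). The source name, FBO size, grid step and the LaureateDisplay calls are assumptions, not taken from this repository.

// MappingSource.cpp -- hypothetical sketch, not from the original repo
#include "MappingSource.h"

MappingSource::MappingSource() : debugGridStep(20) {} // step size is an assumption

void MappingSource::setup() {
    name = "NobelMappingSource"; // shown in piMapper's source selector
    allocate(1920, 1080);        // FBO size; real values likely live in Constants.h
}

void MappingSource::update() {
    if (laureateDisplay) laureateDisplay->update(); // assumes LaureateDisplay exposes update()
}

void MappingSource::draw() {
    ofClear(0);
    if (laureateDisplay) laureateDisplay->draw();   // assumes LaureateDisplay exposes draw()
    drawDebugGrid();
}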
// queries/strategyMirrorTradeTriggerExists.go
package queries

import (
	"database/sql"
	"fmt"
	"strings"

	"github.com/stellar/kelp/api"
)

// sqlQueryStrategyMirrorTradeTriggerExists queries the strategy_mirror_trade_triggers table by market_id and txid (primary key) to see if the row exists
const sqlQueryStrategyMirrorTradeTriggerExists = "SELECT * FROM strategy_mirror_trade_triggers WHERE market_id = $1 AND txid = $2"

// StrategyMirrorTradeTriggerExists is a query that fetches the row by primary key
type StrategyMirrorTradeTriggerExists struct {
	db       *sql.DB
	sqlQuery string
	marketID string
}

var _ api.Query = &StrategyMirrorTradeTriggerExists{}

// MakeStrategyMirrorTradeTriggerExists makes the StrategyMirrorTradeTriggerExists query
func MakeStrategyMirrorTradeTriggerExists(db *sql.DB, marketID string) (*StrategyMirrorTradeTriggerExists, error) {
	if db == nil {
		return nil, fmt.Errorf("the provided db should be non-nil")
	}

	return &StrategyMirrorTradeTriggerExists{
		db:       db,
		sqlQuery: sqlQueryStrategyMirrorTradeTriggerExists,
		marketID: marketID,
	}, nil
}

// Name impl.
func (q *StrategyMirrorTradeTriggerExists) Name() string {
	return "StrategyMirrorTradeTriggerExists"
}

// QueryRow impl.
func (q *StrategyMirrorTradeTriggerExists) QueryRow(args ...interface{}) (interface{}, error) {
	if len(args) != 1 {
		return nil, fmt.Errorf("expected 1 args (txid string), but got args %v", args)
	} else if _, ok := args[0].(string); !ok {
		return nil, fmt.Errorf("input arg[0] needs to be of type 'string', but was of type '%T'", args[0])
	}

	row := q.db.QueryRow(q.sqlQuery, q.marketID, args[0])

	var marketID, txID, backingMarketID, backingOrderID string
	e := row.Scan(&marketID, &txID, &backingMarketID, &backingOrderID)
	if e != nil {
		if strings.Contains(e.Error(), "no rows in result set") {
			return false, nil
		}
		return nil, fmt.Errorf("could not read data from StrategyMirrorTradeTriggerExists query: %s", e)
	}
	return true, nil
}
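A short usage sketch follows; the open *sql.DB, the market ID and the transaction hash are illustrative placeholders, and imports (log, fmt) are elided.

// Hypothetical caller of the query above.
q, e := MakeStrategyMirrorTradeTriggerExists(db, "market_1")
if e != nil {
	log.Fatal(e)
}
result, e := q.QueryRow("abcdef0123456789") // single arg: the txid
if e != nil {
	log.Fatal(e)
}
exists, _ := result.(bool) // the query returns true/false boxed in an interface{}
fmt.Println("trigger exists:", exists)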
def save_models(self):
    """Persist one error model per mapping entry for this job type.

    Assumes each value in self._mapping exposes create_model() and that
    Error.objects provides a bulk save_job_error_models() helper.
    """
    error_models = [error.create_model() for error in self._mapping.values()]
    Error.objects.save_job_error_models(self.job_type_name, error_models)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# burger_war_dev/scripts/judge_listener.py
'''
This is the judge_listener node.

Subscribes to no topics.
Publishes the 'war_state_info' topic.
Mainly used by the simple sample program by yoda-ocu.
'''

import rospy
import rosparam
from std_msgs.msg import String
from burger_war_dev.msg import war_state

import requests
import json


class JudgeListener():
    def __init__(self, side, judge_url, bot_name="NoName"):
        # bot name
        self.name = bot_name
        self.side = side
        self.url = judge_url + "warState"
        self.war_state_pub = rospy.Publisher('war_state_info', war_state, queue_size=10)
        rospy.loginfo("College-Friends side is {}".format(self.side))
        if requests.get(self.url).ok:
            self.strategy()
        else:
            rospy.logerr("Failed to fetch warState : {}".format(self.url))

    def fetchState(self):
        resp = requests.get(self.url)
        resp_json = resp.json()

        state = war_state()
        state.time = resp_json['time']
        state.state = resp_json['state']
        state.my_side = self.side
        state.my_point = resp_json['scores'][self.side]
        state.target_names = []
        state.target_owner = []
        state.target_point = []
        for target in resp_json['targets']:
            state.target_names.append(target['name'])
            state.target_owner.append(target['player'])
            state.target_point.append(int(target['point']))
        if self.side == 'r':
            state.enemy_point = resp_json['scores']['b']
        elif self.side == 'b':
            state.enemy_point = resp_json['scores']['r']
        else:
            rospy.logerr("UNEXPECTED SIDE NAME : {}".format(self.side))
            raise Exception("UNEXPECTED SIDE NAME : {}".format(self.side))
        return state

    def strategy(self):
        r = rospy.Rate(1)  # publish at 1 Hz
        while not rospy.is_shutdown():
            info = self.fetchState()
            self.war_state_pub.publish(info)
            r.sleep()


if __name__ == '__main__':
    rospy.init_node('judge_node')
    bot = JudgeListener(side=rosparam.get_param("cf_side"),
                        judge_url=rosparam.get_param("judge_url"))
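For completeness, a minimal consumer of the topic published above might look like this; only the topic name and message type are taken from the node itself, the rest is illustrative.

#!/usr/bin/env python
# Hypothetical consumer of the war_state_info topic published by judge_listener.
import rospy
from burger_war_dev.msg import war_state

def on_state(msg):
    rospy.loginfo("t=%s state=%s my=%d enemy=%d",
                  msg.time, msg.state, msg.my_point, msg.enemy_point)

if __name__ == '__main__':
    rospy.init_node('war_state_consumer')
    rospy.Subscriber('war_state_info', war_state, on_state)
    rospy.spin()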
// Autogenerated by Thrift Compiler (0.9.3)
// DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING

package microservice

import (
	"bytes"
	"fmt"

	"git.apache.org/thrift.git/lib/go/thrift"
)

// (needed to ensure safety because of naive import list construction.)
var _ = thrift.ZERO
var _ = fmt.Printf
var _ = bytes.Equal

var GoUnusedProtection__ int

// Attributes:
//  - ID
//  - Firstname
//  - Lastname
//  - Email
//  - Age
//  - Active
type Person struct {
	ID        int32   `thrift:"id,1,required" json:"id"`
	Firstname string  `thrift:"firstname,2,required" json:"firstname"`
	Lastname  *string `thrift:"lastname,3" json:"lastname,omitempty"`
	Email     *string `thrift:"email,4" json:"email,omitempty"`
	Age       int16   `thrift:"age,5,required" json:"age"`
	Active    bool    `thrift:"active,6,required" json:"active"`
}

func NewPerson() *Person {
	return &Person{}
}

func (p *Person) GetID() int32 {
	return p.ID
}

func (p *Person) GetFirstname() string {
	return p.Firstname
}

var Person_Lastname_DEFAULT string

func (p *Person) GetLastname() string {
	if !p.IsSetLastname() {
		return Person_Lastname_DEFAULT
	}
	return *p.Lastname
}

var Person_Email_DEFAULT string

func (p *Person) GetEmail() string {
	if !p.IsSetEmail() {
		return Person_Email_DEFAULT
	}
	return *p.Email
}

func (p *Person) GetAge() int16 {
	return p.Age
}

func (p *Person) GetActive() bool {
	return p.Active
}

func (p *Person) IsSetLastname() bool {
	return p.Lastname != nil
}

func (p *Person) IsSetEmail() bool {
	return p.Email != nil
}

func (p *Person) Read(iprot thrift.TProtocol) error {
	if _, err := iprot.ReadStructBegin(); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T read error: ", p), err)
	}

	var issetID bool = false
	var issetFirstname bool = false
	var issetAge bool = false
	var issetActive bool = false

	for {
		_, fieldTypeId, fieldId, err := iprot.ReadFieldBegin()
		if err != nil {
			return thrift.PrependError(fmt.Sprintf("%T field %d read error: ", p, fieldId), err)
		}
		if fieldTypeId == thrift.STOP {
			break
		}
		switch fieldId {
		case 1:
			if err := p.readField1(iprot); err != nil {
				return err
			}
			issetID = true
		case 2:
			if err := p.readField2(iprot); err != nil {
				return err
			}
			issetFirstname = true
		case 3:
			if err := p.readField3(iprot); err != nil {
				return err
			}
		case 4:
			if err := p.readField4(iprot); err != nil {
				return err
			}
		case 5:
			if err := p.readField5(iprot); err != nil {
				return err
			}
			issetAge = true
		case 6:
			if err := p.readField6(iprot); err != nil {
				return err
			}
			issetActive = true
		default:
			if err := iprot.Skip(fieldTypeId); err != nil {
				return err
			}
		}
		if err := iprot.ReadFieldEnd(); err != nil {
			return err
		}
	}
	if err := iprot.ReadStructEnd(); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T read struct end error: ", p), err)
	}
	if !issetID {
		return thrift.NewTProtocolExceptionWithType(thrift.INVALID_DATA, fmt.Errorf("Required field ID is not set"))
	}
	if !issetFirstname {
		return thrift.NewTProtocolExceptionWithType(thrift.INVALID_DATA, fmt.Errorf("Required field Firstname is not set"))
	}
	if !issetAge {
		return thrift.NewTProtocolExceptionWithType(thrift.INVALID_DATA, fmt.Errorf("Required field Age is not set"))
	}
	if !issetActive {
		return thrift.NewTProtocolExceptionWithType(thrift.INVALID_DATA, fmt.Errorf("Required field Active is not set"))
	}
	return nil
}

func (p *Person) readField1(iprot thrift.TProtocol) error {
	if v, err := iprot.ReadI32(); err != nil {
		return thrift.PrependError("error reading field 1: ", err)
	} else {
		p.ID = v
	}
	return nil
}

func (p *Person) readField2(iprot thrift.TProtocol) error {
	if v, err := iprot.ReadString(); err != nil {
		return thrift.PrependError("error reading field 2: ", err)
	} else {
		p.Firstname = v
	}
	return nil
}

func (p *Person) readField3(iprot thrift.TProtocol) error {
	if v, err := iprot.ReadString(); err != nil {
		return thrift.PrependError("error reading field 3: ", err)
	} else {
		p.Lastname = &v
	}
	return nil
}

func (p *Person) readField4(iprot thrift.TProtocol) error {
	if v, err := iprot.ReadString(); err != nil {
		return thrift.PrependError("error reading field 4: ", err)
	} else {
		p.Email = &v
	}
	return nil
}

func (p *Person) readField5(iprot thrift.TProtocol) error {
	if v, err := iprot.ReadI16(); err != nil {
		return thrift.PrependError("error reading field 5: ", err)
	} else {
		p.Age = v
	}
	return nil
}

func (p *Person) readField6(iprot thrift.TProtocol) error {
	if v, err := iprot.ReadBool(); err != nil {
		return thrift.PrependError("error reading field 6: ", err)
	} else {
		p.Active = v
	}
	return nil
}

func (p *Person) Write(oprot thrift.TProtocol) error {
	if err := oprot.WriteStructBegin("Person"); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write struct begin error: ", p), err)
	}
	if err := p.writeField1(oprot); err != nil {
		return err
	}
	if err := p.writeField2(oprot); err != nil {
		return err
	}
	if err := p.writeField3(oprot); err != nil {
		return err
	}
	if err := p.writeField4(oprot); err != nil {
		return err
	}
	if err := p.writeField5(oprot); err != nil {
		return err
	}
	if err := p.writeField6(oprot); err != nil {
		return err
	}
	if err := oprot.WriteFieldStop(); err != nil {
		return thrift.PrependError("write field stop error: ", err)
	}
	if err := oprot.WriteStructEnd(); err != nil {
		return thrift.PrependError("write struct stop error: ", err)
	}
	return nil
}

func (p *Person) writeField1(oprot thrift.TProtocol) (err error) {
	if err := oprot.WriteFieldBegin("id", thrift.I32, 1); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field begin error 1:id: ", p), err)
	}
	if err := oprot.WriteI32(int32(p.ID)); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T.id (1) field write error: ", p), err)
	}
	if err := oprot.WriteFieldEnd(); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field end error 1:id: ", p), err)
	}
	return err
}

func (p *Person) writeField2(oprot thrift.TProtocol) (err error) {
	if err := oprot.WriteFieldBegin("firstname", thrift.STRING, 2); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field begin error 2:firstname: ", p), err)
	}
	if err := oprot.WriteString(string(p.Firstname)); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T.firstname (2) field write error: ", p), err)
	}
	if err := oprot.WriteFieldEnd(); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field end error 2:firstname: ", p), err)
	}
	return err
}

func (p *Person) writeField3(oprot thrift.TProtocol) (err error) {
	if p.IsSetLastname() {
		if err := oprot.WriteFieldBegin("lastname", thrift.STRING, 3); err != nil {
			return thrift.PrependError(fmt.Sprintf("%T write field begin error 3:lastname: ", p), err)
		}
		if err := oprot.WriteString(string(*p.Lastname)); err != nil {
			return thrift.PrependError(fmt.Sprintf("%T.lastname (3) field write error: ", p), err)
		}
		if err := oprot.WriteFieldEnd(); err != nil {
			return thrift.PrependError(fmt.Sprintf("%T write field end error 3:lastname: ", p), err)
		}
	}
	return err
}

func (p *Person) writeField4(oprot thrift.TProtocol) (err error) {
	if p.IsSetEmail() {
		if err := oprot.WriteFieldBegin("email", thrift.STRING, 4); err != nil {
			return thrift.PrependError(fmt.Sprintf("%T write field begin error 4:email: ", p), err)
		}
		if err := oprot.WriteString(string(*p.Email)); err != nil {
			return thrift.PrependError(fmt.Sprintf("%T.email (4) field write error: ", p), err)
		}
		if err := oprot.WriteFieldEnd(); err != nil {
			return thrift.PrependError(fmt.Sprintf("%T write field end error 4:email: ", p), err)
		}
	}
	return err
}

func (p *Person) writeField5(oprot thrift.TProtocol) (err error) {
	if err := oprot.WriteFieldBegin("age", thrift.I16, 5); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field begin error 5:age: ", p), err)
	}
	if err := oprot.WriteI16(int16(p.Age)); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T.age (5) field write error: ", p), err)
	}
	if err := oprot.WriteFieldEnd(); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field end error 5:age: ", p), err)
	}
	return err
}

func (p *Person) writeField6(oprot thrift.TProtocol) (err error) {
	if err := oprot.WriteFieldBegin("active", thrift.BOOL, 6); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field begin error 6:active: ", p), err)
	}
	if err := oprot.WriteBool(bool(p.Active)); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T.active (6) field write error: ", p), err)
	}
	if err := oprot.WriteFieldEnd(); err != nil {
		return thrift.PrependError(fmt.Sprintf("%T write field end error 6:active: ", p), err)
	}
	return err
}

func (p *Person) String() string {
	if p == nil {
		return "<nil>"
	}
	return fmt.Sprintf("Person(%+v)", *p)
}
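To illustrate how the generated Read/Write pair is meant to be used, here is a hedged round-trip sketch. The in-memory buffer and binary protocol are one reasonable choice from the thrift Go library, not something mandated by the generated file, and imports (log, fmt) are elided.

// Hypothetical round-trip through an in-memory transport.
buf := thrift.NewTMemoryBuffer()
proto := thrift.NewTBinaryProtocol(buf, true, true)

in := &Person{ID: 1, Firstname: "Ada", Age: 36, Active: true} // all required fields set
if err := in.Write(proto); err != nil {
	log.Fatal(err)
}

out := NewPerson()
if err := out.Read(proto); err != nil { // fails if a required field is missing
	log.Fatal(err)
}
fmt.Println(out.String())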
Many passenger seats such as those on passenger aircraft, buses, trains, and the like are arranged so that each passenger seat, other than the forward-most located passenger seats, faces the back of the next forward passenger seat. To increase a passenger's comfort, many passenger seat backs rotate between upright and reclined positions. In some instances, a tray table may be mounted adjacent to the back of each passenger seat for use by a passenger in the next aft passenger seat. The tray table is deployed by the passenger to provide a relatively flat surface for eating, working, recreation, or other uses. In many conventional uses, the tray table may be mounted to the back of each passenger seat via a pair of retractable arms that allow the tray table to be pulled toward the passenger while deployed. Conventionally, the tray tables are mounted indirectly to the base of the passenger seat, and are not mounted to the seat back frame of the passenger seat. This allows the seat back to recline while not affecting the use of the tray for an aft passenger. The separation of seat back frame and tray table also helps to isolate the tray table from vibration or other disturbances. Recently, there has been movement in carrier industries away from passenger comfort and convenience towards light-weight, compact seating arrangements. Many common carriers have moved away from reclining seats, particularly on shorter routes, to allow for lighter seats and reduced space requirements for passengers. This change in focus has led to non-reclining seats, and new opportunities for advancement of tray table technology. In particular, a tray table mounted directly to a seat back frame offers a number of advantages over conventional tray table assemblies. The present invention discloses, among other things, a tray table that is pivotally coupled to the seat back frame of a non-reclining passenger seat. This arrangement allows for a simplified mounting and installation of the tray table, which saves weight, material costs, and labor costs during installation. A tray table pivotally coupled to the seat back frame is also a more reliable design, with fewer moving parts and potential pinch points than the traditional tray table mounting arrangements. Furthermore, pivotally coupling a tray table to a non-reclining seat back frame saves space. There is less space required between the aft edge of the seat back and the forward edge of the deployed tray table due to the fixed nature of the seat back frame. There are also opportunities for coupling the tray table to the seat back frame in such a way as to recess the tray table into the seat back in its stowed position, further reducing passenger space requirements and allowing for more compact seating arrangements.
Burned by a scandal in its offer business over inappropriate promos, Offerpal Media is moving to set standards that forbid offers that are misleading, deceptive or otherwise objectionable. The action is the first move the company has made since it brought in a new chief executive, George Garrick, a couple of weeks ago. The CEO transition happened in the midst of a debate over scam offers that were allegedly duping users into mobile subscription deals and other purchases that they really didn’t want. Offers are special ads that are used to monetize otherwise free social games on Facebook and other social networks. For example, a user will agree to sign up for a Netflix subscription or credit card offer. The business has been lucrative because the offers guaranteed some kind of direct benefit for the advertisers, who are willing to pay more. And these offers are appealing to gamers who didn’t want to pay for virtual goods with credit cards. Offer companies like Offerpal, Super Rewards, Gambit, Trialpay and others have been at the forefront of the boom in virtual goods, which are expected to generate $1 billion in U.S. revenues in 2009. But as with other online media, complaints arose that some offers were scams. The scandal ignited last month as Techcrunch editor Michael Arrington challenged former Offerpal CEO Anu Shukla over the legitimacy of most offers at the Virtual Goods Summit. Shukla said that less than 1 percent of the offers have generated complaints from users. But the ensuing debate led to changes in policy regarding offers at Zynga, RockYou, and other companies. Facebook renewed its crackdown on scams and actually stopped one of Zynga’s games, FishVille, from launching until Zynga removed controversial offers. Zynga has also stopped using offers until it can set up a system to properly vet them. In an interview, Garrick said that the move is a result of some actions he took upon becoming CEO. He said that the new standards were developed in close contact with publishers such as Zynga and platform owners such as Facebook. He said that Offerpal has improved its tools that can automatically detect non-compliant offers and remove them from the system. Garrick referred to this as an “electronic watch dog.” He noted that Offerpal has pulled down hundreds of offers in recent weeks and has been putting them back up as they become compliant. Garrick noted, for instance, that the cost of an offer to consumers is now more clearly spelled out, with bigger font sizes, so there will be fewer complaints about offers containing “fine print” that cost consumers money. Offerpal’s move may help head off litigation. Lawyers have been fishing for clients for possible class-action lawsuits against the industry, on behalf of harmed consumers. Garrick admitted recently that Offerpal’s enforcement of its standards weren’t as good as they could have been, and he confirmed that view in the interview today. “I personally try to avoid situations that are combative or confrontational,” Garrick said. Garrick said that Shukla will continue as a board member and an advisor, but he said that she will not have an operational role in the company. Offerpal says it is the first to offer a set of guidelines for advertisers. These include a multi-step review process before each offer goes live and automated processes for continually verifying the current offers. Offerpal will continue to do some things it says it has always done, like personally handling each complaint that comes in. 
The company says it has remained in close contact with Facebook, MySpace and other social platforms, which have moved toward a “zero tolerance” position on offer scams. Offerpal also says it has added many high-value offers from recognizable brands such as Real Networks, ProFlowers, New York Times, DirecTV, Disney, Yahoo, Blockbuster, Netflix, Priceline, Discover Card, Columbia House, Nielsen, Gevalia and many more.
// Description: Snippets
import { cmd } from "../core/utils.js"
import { Script } from "../types/core.js"

setName(``)

setFlags({
  ["new"]: {
    name: "New",
    description: "Create a new snippet script",
    shortcut: `${cmd}+n`,
  },
  ["edit"]: {
    name: "Edit",
    description: "Open the selected script in your editor",
    shortcut: `${cmd}+o`,
  },
  remove: {
    name: "Remove",
    description: "Delete the selected script",
    shortcut: `${cmd}+shift+backspace`,
  },
})

let snippetScript = await selectScript(
  {
    placeholder: "Select a snippet",
    footer: `create snippet: ${cmd}+n | edit snippet: ${cmd}+o | remove snippet: ${cmd}+shift+delete`,
  },
  true,
  scripts => scripts.filter(script => script?.snippet)
)

if (flag?.new) {
  let script = await arg("Enter a Snippet Script Name")
  await run(
    `${kitPath("cli", "new")}.js --template snippet ${script
      .trim()
      .replace(/\s/g, "-")
      .toLowerCase()} --scriptName '${script.trim()}'`
  )
} else if (flag?.edit) {
  await edit((snippetScript as Script).filePath, kenvPath())
} else if (flag?.remove) {
  await run(
    `${kitPath("cli", "remove")}.js ${
      (snippetScript as Script).filePath
    } `
  )
} else {
  await run((snippetScript as Script).filePath)
}

export {}
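The `script?.snippet` filter above implies that snippet scripts carry a `snippet` metadata field parsed from the script file. As a rough illustration only, a snippet script created from the `snippet` template might look like the following; the `// Snippet:` metadata key is an assumption based on Script Kit's `// Key: value` comment convention and should be verified against the kit docs.

// Name: Sign-off
// Snippet: sig      <- assumed metadata key, not confirmed by the file above
import "@johnlindquist/kit"

await setSelectedText("Best regards,\nJohn")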
/**
 * Creates the end method
 *
 * @throws Exception
 */
private void createEndMethod() throws Exception {
    Method end = getMethod(new Id("end-step"), false);
    if (end == null) {
        end = new Method(new Id("end"));
        addMethod(end);
    }
    // Rename the user-defined end-step method to "end"
    end.id = new Id("end");

    if (end.arguments.size() >= 2)
        throw new Exception("Wrong number of arguments for the end-step method");

    // Preserve the user's name for the tick argument if one was given
    Id tickId = new Id("tick");
    if (end.arguments.size() == 1) {
        tickId = end.arguments.get(0).id;
    }
    end.arguments.clear();
    end.addArgument(new Variable(tickId, SparkModel.getInstance().getType(new Id("$long"))));

    if (end.getReturnType() == null
            || end.getReturnType() == SparkModel.getInstance().getType(new Id("boolean"))) {
        if (end.getReturnType() == null) {
            // Append "return false" to the source code when no return type was declared
            int n = end.sourceCode.size();
            end.sourceCode.insert(new Symbol("return", sym.IDENTIFIER, "return", -1, -1), n);
            end.sourceCode.insert(new Symbol("false", sym.IDENTIFIER, "false", -1, -1), n + 1);
        }
        end.setReturnType(SparkModel.getInstance().getType(new Id("boolean")));
    } else {
        throw new Exception("end-step method should be of type boolean (or void)");
    }
}
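Concretely, the transformation this method performs can be sketched as follows; the surface syntax of the modeled language is assumed, since only the compiler internals appear above.

// Illustrative only (assumed SPARK-PL-like surface syntax):
//
//   before:  end-step(t)                 // user method, any argument name, no return type
//   after:   boolean end(long tick) {    // renamed to "end", one $long argument enforced
//                ...original body...
//                return false;           // appended when the user declared no return type
//            }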
From the Eagle’s Eye, a rotating student photography exhibit on the fourth floor of Maloney Hall, accepts submissions based on a new theme every semester. The Spring 2018 exhibit’s theme is “gratitude,” and the submissions were all extraordinarily touching and personal. Many of the pieces included a note about the picture—who it was taken of or why it’s important. Students who have no background in photography are also encouraged to submit their pictures, with the hope that the exhibit will allow viewers to get a glimpse of everyday Boston College life. The exhibit contained pieces with subjects ranging from nature, to family and friends, to volunteer cohorts. It truly illustrated just how diverse gratitude can be. In a stunningly simple picture submitted by Lori Niehaus, MCAS ’18, two monkeys gently hold on to each other while looking out into the green, leafy horizon. The tender serenity exemplified in the monkeys’ stance instantly puts the viewer at ease. The photograph illustrates how important it is to enjoy the simplicities of nature. It is an important reminder to open our eyes and absorb the beautiful world we live in. Students should put down their phones and look up to the sky and around them to fully take in BC’s stunning campus, or else the time will pass right before their eyes—even the brightest fall foliage only lasts but a few days. Many students submitted photography taken on service trips or while studying abroad. These pictures were touching, as they often involved students interacting with the youth of an underprivileged country. Others were centered on unique landscapes that can’t be replicated or found anywhere in the United States. An impressive shot from abroad was submitted by Sadiq Ervin, LSOE ’19, and featured a young child standing over a large crowd of pigeons. Ervin’s caption read “[pigeons] are much more beautiful outside of street corners and train platforms,” illustrating the sheer beauty of nature when one takes the time to truly examine it. This candid photograph looks professional—the child’s shadow is cast perfectly in a space without pigeons and the large group of birds contrasts with the water in the background. Some of the pieces were organized strategically throughout the reception area. For example, two pictures with a strikingly bright blue as the main color were positioned next to each other. The photograph on top, by Kerri Evans, SSW ’20, was a close-up of beautiful bluish-purple flowers. Directly underneath it was a picture of a baby crawling in the grass by Ivelisse Mandato, MCAS ’20. The baby had large, clear blue eyes and was wearing a blue shirt and socks. The different shades of blue in these two pictures complemented each other, and the placement of the two pieces almost presented them as one whole. There are 14,125 different ways Boston College is being experienced at this very moment. From the Eagle’s Eye strives to exhibit these different personal journeys. By displaying work of students from different walks of life, in different years of schooling, with different levels of photography skills, From the Eagle’s Eye effectively illuminates BC through the unique lives of its students.
A financial trader who appeared on the BBC and said he dreamed of making money from another recession was not a hoaxer, the broadcaster has said. Users of Twitter have cast doubt on Alessio Rastani's credentials. But the BBC said: "We've carried out detailed investigations and can't find any evidence to suggest that the interview... was a hoax." On his website Mr Rastani says he is "an experienced stock market and forex trader and professional speaker". In a live interview, broadcast on the BBC News Channel on Monday, Mr Rastani said: "For most traders, we don't really care that much how they're going to fix the economy, how they're going to fix the whole situation - our job is to make money from it. He then added: "The governments don't rule the world. [Investment bank] Goldman Sachs rules the world. Goldman Sachs does not care about this rescue package, neither does the big funds." After Twitter speculation that he was a member of hoaxers The Yes Men, the BBC press office made enquiries and concluded: "He is an independent market trader and one of a range of voices we've had on air to talk about the recession." The Yes Men said Mr Rastani was not connected with the group. However, The Yes Men's Mike Bonanno said Mr Rastani's comments were suspicious. He told the BBC: "People in power don't speak that way. But who cares if he is or is not real. What we should be paying attention to is our own surprise at hearing his words." Mr Bonanno said that "real industry insiders ... usually remain silent about the way the system works". "They obscure their actions with technical jargon and deceit. They have to, because when people find out what is going on - when it is spoken in plain language - there is widespread outrage."
ISOLATION AND CHARACTERISATION OF SUGARCANE RHIZOBACTERIA AND THEIR EFFECT ON NEMATODES A survey showed that species of Burkholderia were frequently associated with the roots of sugarcane. Based on amplified ribosomal DNA restriction analysis, the isolates of these bacteria belonged to Burkholderia groups A, B, C, F and G. Data collected from field trials revealed a succession of species during the growth of the crop. Strains belonging to the B. cepacia complex were dominant during the early stages of growth but, in time, were replaced by isolates of the B. tropicalis and B. caribensis groups. In a series of in vitro bioassays it was found that a large proportion of the Burkholderia isolates tested paralysed juveniles of a root-knot nematode, Meloidogyne sp.
Comparative transcriptomic analyses of citrus cold-resistant vs. sensitive rootstocks might suggest a relevant role of ABA signaling in triggering cold scion adaption Background The citrus genus comprises a number of tropical and subtropical species sensitive to cold stress, which limits global citrus distribution to certain latitudes and causes major economic loss. We used RNA-Seq technology to analyze changes in the transcriptome of Valencia delta seedless orange in response to long-term cold stress grafted on two frequently used citrus rootstocks: Carrizo citrange (CAR), considered one of the most cold-tolerant accessions, and C. macrophylla (MAC), a very sensitive one. Our objectives were to identify the genetic mechanisms that produce the tolerant or sensitive phenotypes in citrus, as well as to gain insight into the rootstock-scion interactions that induce cold tolerance or sensitivity in the scion. Results Plants were kept at 1 °C for 30 days. Samples were taken at 0, 15 and 30 days. The metabolomic analysis showed a significant increase in the concentration of free sugars and proline, which was higher for the CAR plants. Hormone quantification in roots showed a substantially increased ABA concentration during cold exposure in the CAR roots, which was not observed in MAC. Different approaches were followed to analyze gene expression. During the stress treatment, the 0-15-day comparison yielded the most DEGs. The functional characterization of DEGs showed enrichment in GO terms and KEGG pathways related to abiotic stress responses previously described in plant cold adaption. The DEG analysis revealed that several key genes promoting cold adaption were up-regulated in the CAR plants, and those repressing it had higher expression levels in the MAC samples. Conclusions The metabolomic and transcriptomic study performed herein indicates that the mechanisms activated in plants shortly after cold exposure remain active in the long term. Both the hormone quantification and the differential expression analysis suggest that ABA signaling might play a relevant role in promoting the cold hardiness or sensitiveness of Valencia sweet orange grafted onto Carrizo citrange or Macrophylla rootstocks, respectively. Our work provides new insights into the mechanisms by which rootstocks modulate resistance to abiotic stress in the production variety grafted onto them. Supplementary Information The online version contains supplementary material available at 10.1186/s12870-022-03578-w. Abiotic stresses such as low temperature are among the main factors conditioning plants' geographical distribution in nature, which limits plant productivity in agriculture and threatens food security. The adverse effects of these abiotic stresses are exacerbated by climate change and global warming, which have been predicted to result in extreme weather episodes with intense rainfall, drought and rising temperatures, along with more frequent cold and heat waves. In order to improve plant adaptation to these changing environmental conditions, unraveling the mechanisms that allow plants to sense stress signals and that control plant responses to abiotic stresses is a crucial step. Improving plant stress resistance is critical not only for agricultural productivity, but also for environmental sustainability, because crops with poor stress resistance consume excessive water and fertilizer.
The development of NGS technologies, particularly analysis of the transcriptome with RNA-Seq, has allowed new insights into the genetic mechanisms by which plant species adapt to their environment under diverse conditions, including exposure to abiotic stresses. Low temperatures have a strong impact on the growth and geographical distribution of plants, and especially affect tropical crops. Plant adaptation to cold stress involves a large number of genes with minor additive effects, and a clear view of the different genetic regulatory pathways governing responses to low temperatures has only recently been obtained in model plants. During cold acclimation, the expression of cold-responsive (COR) genes is activated in Arabidopsis by C-repeat-binding factors, namely genes CBF1, CBF2 and CBF3. Recent studies have elucidated the molecular mechanisms by which plants activate COR genes in response to cold signals and have shown that they are also regulated through CBF-independent pathways. CBFs are activated by the MYC-like bHLH INDUCER OF CBF EXPRESSION 1 (ICE1). The stability of the ICE1 protein in the nucleus, and therefore its activity, increases with post-transcriptional sumoylation by the SUMO E3 ligase SIZ1. During cold stress, stabilization by SIZ1 is counteracted by ubiquitination and subsequent ICE1 degradation due to the RING E3 ligase HOS1, which reaches the nucleus where ICE1 is located. Recent reports show that ICE1 is regulated by protein kinases: MPK (mitogen-activated protein kinase), MKK (mitogen-activated protein kinase kinase) and MEKK (mitogen-activated protein kinase kinase kinase). These protein kinases perform different functions and promote the degradation or activation of ICE1 and, consequently, the activation or repression of CBF genes. Besides ICE1, there are other transcription factors (TFs) independent of the CBF regulon. In fact, the CBF regulon only represents 12% of the transcriptome response to low temperature in Arabidopsis thaliana. Several of these TFs bind CBF promoters: LHY and CAMTA (involved in the circadian clock); HY5 (involved in light signaling); SOC1 (involved in flower development); MYB15 (a negative regulator of CBF genes). Experiments run with single, double or triple cbf mutants in A. thaliana have demonstrated that not all COR genes are affected by CBF genes, and a CBF-independent pathway is involved in the regulation of cold response genes. Of the 27 first-wave transcription factors, HSFC1, ZAT12 and CZF1 regulate the expression of COR genes, but their expression is not affected in cbf triple mutants. The CBF-COR pathway functions in plants other than A. thaliana, and CBF genes have been identified in a range of plant species, such as poplar, silver birch and several grasses. The conserved role of CBF genes has been confirmed in other species, such as apple, barley, potato and poplar, because their overexpression enhances freezing tolerance even without cold acclimation [23, 27]. Light quality, circadian rhythm and photoperiod are other factors that regulate cold acclimation and affect the expression of CBF and COR genes. Light is necessary for increasing freezing tolerance and is required for the expression of several cold-regulated genes (CORs) in different plant species: phytochromes activate COR15a and COR14b gene expression, and also regulate CBF target genes under red/far-red light conditions.
The circadian clock prepares plants for cold stress, which likely occurs at night: at warm temperatures, the transcript levels of CBF1, CBF2 and CBF3 oscillate, peaking at about 8 h after dawn (zeitgeber time 8; ZT8) with a trough at about ZT20. Moreover, the cold induction of CBF1, CBF2 and CBF3 is "gated" by the clock; if plants are exposed to low temperature at ZT4, the increase in the CBF1, CBF2 and CBF3 transcript levels is much more marked than if plants are exposed to low temperature at ZT16. A correlation between plant winter hardiness and the accumulation of compatible solutes has been known for decades, and a growing number of studies indicate that sugars play an essential role in plant cold tolerance. The accumulation of soluble sugars, such as raffinose, which in Arabidopsis is involved in stabilizing PSII of cold-acclimated leaf cells, or trehalose, which confers high tolerance to different abiotic stresses, including low temperatures in rice plants, is associated with rapid starch degradation in chloroplasts, which is thought to be induced by abscisic acid (ABA) in response to cold and to other abiotic stresses. Cold adaption is often associated with an increased level of this stress hormone, and a rising ABA level is thought to trigger a number of cellular responses required to develop freezing tolerance. Exogenous ABA application results in increased freezing tolerance in various higher plant species, including potato, wheat, winter rape and Arabidopsis, which suggests that ABA and cold might function in common physiological processes that lead to the development of enhanced freezing tolerance in plants. Citrus is one of the most economically important fruit tree crops in the world, with 148.7 million tons registered during the 2018/2019 season, which represents a 20.8% increase over the last 10 years and reflects constant annual growth. Climate is the most important factor for site selection in most citrus-growing regions around the world. Beyond certain limits, losses of tree crops or impaired fruit quality as a result of adverse weather conditions are so important that the production of any citrus fruit can be rendered impossible or unprofitable. Freezing temperatures are among the most limiting factors for citrus production. Temperatures below 0 °C lead to the formation of intercellular ice crystals, which cause dehydration and damage trees. The threshold temperature that kills young citrus shoots is -12 °C, but it is common knowledge that cold hardiness varies considerably among citrus species. Rootstock/scion combinations are frequently employed in the citrus industry to improve fruit production and quality, as well as tolerance to biotic and abiotic stresses. Rootstocks are selected for root traits linked with resistance to pests and pathogens from the soil, but also with several abiotic stresses, such as salinity, drought, flooding and cold hardiness. One example of the importance of employing rootstocks is the control of citrus tristeza virus (CTV). This virus devastated Spanish citriculture in the 1970s, but it was controlled thanks to the use of tolerant rootstocks. The influence of rootstocks on fruit quality-related traits has a proven significant effect on mandarin fruit size through cell size regulation, and also on tree growth, yield and quality, the leaf mineral composition of lemon, and even the flavonoid content of lemon juice.
A cold hardiness study conducted with a tetraploid Carrizo citrange rootstock showed enhanced natural chilling stress tolerance for common clementine. Carrizo citrange (CAR) is a widespread rootstock that resulted from crossing sweet orange (Citrus sinensis Osb.) and Poncirus trifoliata (L.) Raf., a citrus-relative species that can withstand temperatures of -26 °C when fully cold-acclimated. Several studies have shown that cold acclimation in Poncirus is mediated by the CBF regulon, with ICE1 acting as a central regulator of cold response. The PtrbHLH transcription factor, shown to confer cold tolerance to Poncirus, is able to promote enhanced cold tolerance when overexpressed in pummelo (Citrus grandis). On the other hand, Citrus macrophylla Wester (MAC), which belongs to the papeda group and is the most widespread rootstock in lemon orchards, is extremely sensitive to low temperatures. In this work, we used RNA-Seq technology to study how Valencia sweet orange responds differently to cold stress depending on the employed rootstock. We present evidence of the regulatory processes that produce cold acclimation and suggest a mechanism by which Carrizo, a low-temperature-resistant rootstock, triggers cold acclimation responses in the grafted scion. Total soluble sugars, starch and proline accumulate in response to cold treatment Total soluble sugars and starch were quantified by spectrophotometric methods after 0, 7, 15 and 30 days of cold treatment. A 2-fold increase in total soluble sugars (Fig. 1a) was found in the C30 plants in relation to C0. However, the MAC plants only showed a slight increase in total soluble sugars, and no significant difference was found between M0 and M30. Starch content initially increased in the CAR samples on day 2, and was always higher in the cold-treated plants than in the controls (Fig. 1b). On the contrary, starch content decreased slightly in the MAC cold-treated plants, which had the highest concentrations on day 0 (Fig. 1b). Free proline was quantified by spectrophotometric methods on days 0, 1, 2, 4, 7, 15 and 30. As expected, proline concentration in leaves increased under cold conditions, and was generally higher in CAR than in MAC, most significantly at the end of the experiment, with a more than 2-fold increase (Fig. 1c). A metabolomic profile comprising some relevant primary metabolites was obtained on days 0 and 30 under cold conditions (Table 1). The levels of reduced sugars (rhamnose, trehalose, glucose) were higher on day 30 in both rootstocks, although the increase in CAR was greater than in MAC. The increase in fructose and raffinose was marked, especially for raffinose, which was 300-fold in CAR and 400-fold in MAC. For the protein amino acids phenylalanine and proline and the non-protein amino acid GABA, no significant differences appeared at 30 days in relation to the control (Table 1). However, some differences were found for total proline when comparing both rootstocks, which was 2-fold higher in CAR than in MAC at 30 days, confirming the results shown above. ABA significantly accumulates in Carrizo roots The quantification of the hormones ABA and jasmonic acid (JA) was carried out in roots and leaves at 0, 15 and 30 days (see Methods), and significant differences were found in the accumulation of these hormones in the different organs and rootstocks (Fig. 2).
The ABA concentration in leaves decreased from 90 ng g⁻¹ DW in M0 to 20 ng g⁻¹ DW in M15 and M30, but remained constant in CAR throughout the experiment at around 60 ng g⁻¹ DW (Fig. 2a). The ABA concentration in roots remained constant at low levels (10 ng g⁻¹ DW) in MAC, but increased 3-fold in C30. The total ABA concentration in the C30 roots (40 ng g⁻¹ DW) was strikingly, 8-fold, higher than in the M30 roots (5 ng g⁻¹ DW) (Fig. 2b). Initially, the JA concentration was significantly higher in the MAC leaves (Fig. 2c), with similar levels in the roots of both rootstocks (Fig. 2d). A drop in the amount of JA was observed in both analyzed organs in C15 and M15. The JA concentration remained low in C30, while a clear increase was noted in M30 in both roots and leaves.

Macrophylla plants show damage as a result of cold treatment

We analyzed the CAR and MAC plants in search of any effects caused by low temperatures after 30 days at 1°C. Although the Valencia shoots presented leaf damage on both rootstocks, differences were observed in young shoots, with much more severe cold effects on the scions grafted onto the MAC rootstock. In the MAC plants, young shoots showed brown areas and dead leaves, while in the Carrizo ones they were completely normal (Fig. 3a).

RNA-Seq overview

RNA-Seq was carried out as described in the Methods section. Twelve paired-end libraries were constructed and sequenced (2 × 100 bp reads). After quality trimming, 152.2 million read pairs were obtained, with an average of 12.7 million read pairs per sample, which accounted for 30.6 Gb of useful sequence. Reads were mapped against the 36,454 C. clementina reference transcripts, as described in the Methods section. Overall, more than 120 million reads (79%) mapped to the reference transcripts, of which 77.8 million aligned exactly once and 42.2 million aligned multiple times, while 32.2 million reads did not map. The average number of mapped reads per sample was 10 million, and the average number of reads mapped per gene was 274.3.

The differential expression between Carrizo and Macrophylla is higher after 15 days of cold treatment

The differentially expressed genes (DEGs) between the Carrizo and Macrophylla samples were determined as described in Methods. Intraspecies gene expression analyses were performed by comparing samples from the same rootstock collected after 0, 15 and 30 days of cold treatment. Samples C15 and M15 were compared to C0 and M0, respectively, and samples C30 and M30 were compared to C15 and M15. The largest number of DEGs was obtained in the comparisons involving the 15-day samples (Fig. 3b). The RNA-Seq results were validated by RT-PCR, which was carried out with some relevant DEGs selected from the intra- and interspecies comparative analyses (Additional File 1). In an attempt to identify transcripts involved in the cold tolerance of CAR, interspecies comparisons of the samples from each rootstock collected on the same date were made. Thus, samples C0, C15 and C30 were compared to M0, M15 and M30, respectively. The number of DEGs was much smaller than in the previous strategy but, as observed before, the largest number of DEGs was obtained from the comparison of the 15-day samples, while the analyses of the 0- and 30-day samples only yielded 186 and 119 DEGs, respectively. The numbers of up- and down-regulated transcripts were similar in all the samples (Fig. 3b).
As the largest number of DEGs was obtained after 15 days of cold treatment, our analyses focused on these samples, when the response to low temperatures seemed to peak. The identities of the DEGs obtained by both approaches at 15 days were compared, and the results are shown in a Venn diagram (Fig. 3c), where a high degree of overlap can be seen. In this way, 9826 common differentially expressed transcripts were found in the intraspecific analyses (M0 vs. M15 and C0 vs. C15), which accounted for 77% and 70% of all the obtained DEGs, respectively. Similarly, 2104 differentially expressed transcripts from the interspecific comparison of M15 vs. C15 were also found in the intraspecific studies, representing 87% of the identified DEGs. A core group of 872 transcripts (366 up-regulated, 506 down-regulated) was found to be differentially expressed in all three gene expression studies carried out with the 15-day samples.

Functional annotation of DEGs reveals the main pathways associated with the cold response

The functional annotation of the differentially expressed gene sets obtained from the intra- (0 vs. 15 days) and inter-rootstock (CAR vs. MAC) comparisons was performed with the BLAST2GO tool, and a functional enrichment analysis was performed with the FatiGO tool. GO terms with the lowest FDR scores that were overrepresented in at least two of the three comparisons were further analyzed (Table 2). The most significantly enriched biological function was response to cold, with the lowest FDR score. Other overrepresented functional annotations were the mobilization of sugars, regulation by the hormones ABA and JA, and light and circadian cycles. We found an abundance of GO terms related to sugar metabolic pathways: response to sucrose, oligosaccharide biosynthetic process, fructose metabolic process, glucose metabolic process, sorbitol catabolic process, etc. Some enriched terms, such as raffinose family oligosaccharide biosynthetic process, were present in the interspecific comparison, while others, like galactose catabolic process, were obtained in the C0 vs. C15 analysis (see Table 2). For light response and the circadian clock, an overrepresentation of terms like photosynthesis, light harvesting in photosystem, protein-chromophore linkage, negative regulation of the circadian rhythm, far-red light signaling pathway and response to red light was also observed. Several enriched biological processes were related to hormone signaling: response to jasmonic acid, response to salicylic acid, ABA biosynthetic process, the auxin-activated signaling pathway, etc. Similar results were obtained when searching the KEGG database to determine the biological pathways represented among the differentially expressed loci at 15 days (Additional File 2). Carbohydrate metabolism was clearly the most represented, with 512 loci involved; the pathways for reduced sugars, such as sucrose, galactose or fructose, displayed the largest numbers of differentially expressed loci. Moreover, 263 genes were involved in environmental adaptation, specifically in plant hormone signal transduction and the circadian rhythm, which play crucial roles in the response to cold stress. Similarly, the pathways for lipid metabolism (230 genes), the biosynthesis of other secondary metabolites (216 genes) and amino acid metabolism (129 genes) were well represented in the set of DEGs at 15 days.
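The statistical core of such enrichment analyses is a per-term test of whether a GO term is overrepresented among the DEGs; FatiGO uses Fisher's exact test for this (see Methods). The sketch below illustrates the test for a single hypothetical GO term; all counts are made up for illustration:

```python
# Sketch of the per-term test behind FatiGO-style GO enrichment: a one-sided
# Fisher's exact test on the 2x2 table of DEG membership vs. term annotation.
from scipy.stats import fisher_exact

deg_with_term = 120        # DEGs annotated with the term (made-up count)
deg_without_term = 9706    # remaining DEGs
bg_with_term = 300         # background (non-DEG) genes with the term
bg_without_term = 26328    # remaining background genes

table = [[deg_with_term, deg_without_term],
         [bg_with_term, bg_without_term]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(odds_ratio, p_value)

# Repeating this over all GO terms yields one p-value per term; these are
# then FDR-adjusted (e.g. Benjamini-Hochberg) before applying the cut-off.
```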
Analysis of the differentially expressed genes at 15 days of cold treatment

A study of the 13338 genes that showed a differential expression after 15 days under cold conditions (M0 vs. M15, C0 vs. C15, M15 vs. C15) was carried out to identify those involved in the cold response in citrus and, more specifically, those that could be responsible for the different sensitivity to low temperatures displayed by the two rootstocks. As expected, a significant number of DEGs corresponded to genes involved in the different pathways associated with the cold response, such as the CBF regulon, light and circadian clock regulation, and sugar metabolism.

CBF regulatory pathway

Several genes playing relevant roles in the CBF regulatory pathway were differentially expressed during the cold treatment. They displayed different expression patterns, which are shown on the heat map in Fig. 4a. Three CBF/DREB-like genes were identified, DREB1B (LOC18049462), DREB1D (LOC18033409) and DREB2A (LOC18043521), which were overexpressed in samples C15 and M15 vs. C0 and M0. Interestingly, both the fold change (FC) and the normalized expression levels were generally higher in the Carrizo samples than in the Macrophylla ones; e.g., the DREB1D FC in the C0-C15 comparison was 18-fold higher than in M0-M15, and its expression level was almost double. An ortholog of the ICE1 gene, LOC18051975, was also identified, which displayed a markedly reduced expression in both the MAC and CAR samples. E3 ubiquitin-protein ligase COP1, encoded by the citrus gene LOC18050526 and responsible for ICE1 degradation, showed a steadily decreasing expression in both the CAR and MAC samples during the cold treatment. The citrus SIZ1 gene, LOC18045500, which mediates the SUMO conjugation of ICE1, was differentially expressed in the intraspecies comparisons C0-C15 and M0-M15, and its expression on day 15 was significantly higher in the CAR samples. The closest citrus relative of the negative regulator of the cold response, the SOC1 transcription factor, located at locus LOC18046724, was down-regulated during the cold treatment, while MYB15 (LOC18032541), which binds to the promoters of the CBF genes, showed increasing expression levels during the same period. Downstream of the CBF genes, the citrus relatives of two genes rapidly induced under cold conditions, ZAT10 (LOC18035015) and ZAT11 (LOC18039334), and ZF (LOC18056156), which is involved in the activation of COR genes, were also differentially expressed, with increasing expression levels during the low temperature treatment. A similar expression pattern was displayed by LOC18034062 and LOC18055900, the citrus equivalents of the factors ZAT12 and HSFC1, which have also been described as COR gene activators, but independently of CBF activity. In this case, however, the expression in the CAR samples was significantly higher than in the MAC ones. Protein DEAR1A, encoded in citrus by locus LOC18036471, also belongs to this group; its expression decreased from M0 to M15, which resulted in significantly higher transcript levels in C15 vs. M15.

Light signaling and the circadian clock

The DEGs included some key factors that integrate light and circadian clock regulation with the cold response, as shown in Fig. 4b. We identified the citrus ortholog of phytochrome A (LOC18035337), an elicitor of the cold response, which was up-regulated only in the C15 samples, and phytochrome B (LOC18033413), which plays an antagonistic role and was down-regulated in all the 15-day samples.
Other relevant members of the light regulation system were found to be differentially expressed, such as the phytochrome-interacting factors PIF3 (LOC18034179), PIF4 (LOC18043439) and PIF7 (LOC18046731), with decreasing expression levels. The citrus LHY gene, LOC18047565, was down-regulated during the cold treatment period. However, its expression levels were always much higher in the CAR samples than in the MAC ones; for example, the C15 samples had 4716.40 TMMs, whereas the M15 samples had 1445.42 TMMs. We also identified a citrus HY5 ortholog, LOC18054790, which showed a slight expression increase in the C15 samples while remaining unchanged in M15. However, the amount of HY5 transcripts was 4-fold higher in the C15 samples than in M15, and almost 3-fold higher in C30 vs. M30.

Calcium and MAP kinase signaling cascades

The heat map in Fig. 4c shows several mitogen-activated protein kinases among the DEGs: MEKK1 (LOC18040613), MEKK5 (LOC112100736) and MKK6 (LOC18056166), whose expression decreased in M15 and C15 vs. M0 and C0. We also identified two orthologs of the calcium/calmodulin-regulated receptor-like kinases CRLK1 and CRLK2, the citrus genes at LOC18054581 and LOC18039195, which showed increased expression levels in the 15-day samples that were significantly higher in the CAR ones.

Sugar and amino acid metabolism

In line with the changes observed in the accumulation of sugars, dozens of genes related to their metabolism were found to be differentially expressed during the cold treatment. The most significant results are shown in the heat map in Fig. 5a. Our analyses also revealed fructose accumulation during the cold treatment, and two possible orthologs of the enzyme fructose-1,6-bisphosphate aldolase (FBA), a key enzyme in sugar metabolism that is also involved in the response to abiotic stresses. Two citrus FBA genes were differentially expressed: cytosolic LOC18040008 and chloroplastic LOC18034756. The chloroplastic FBA had similar expression values in the CAR and MAC samples, while the cytosolic one displayed a marked expression increase, with FCs of 3.15 and 5.02 in the C0-C15 and M0-M15 comparisons, respectively. Gene BAM1 (LOC18048130), coding for β-amylase and identified based on its sequence identity with PtrBAM1 from Poncirus trifoliata, was also differentially expressed, and showed a reduced expression at 15 days in both the CAR and MAC samples. As the most significant change in the amino acid quantification was proline accumulation, the expression patterns of the genes involved in proline synthesis were analyzed in detail; they are shown as a heat map in Fig. 5a.

Flavonoid and anthocyanin synthesis

A number of differentially expressed genes were related to the synthesis of anthocyanins, like the anthocyanidin 3-O-glucosyltransferases (LOC18054166, LOC18047244), and of flavonoids, like chalcone-flavonone isomerase 3 (LOC18044429), the chalcone synthases TRANSPARENT TESTA 4 (LOC18042808) and 7 (LOC18051925), flavonoid 3'-monooxygenase (LOC18050323), and flavonol synthase/flavanone 3-hydroxylase (LOC18037475). All of them showed a consistently increased expression in the C15 and C30 samples vs. the M15 and M30 ones (Fig. 5b). We also identified a putative MYB111 gene, LOC18031574, which has been shown to promote the synthesis of anthocyanins and flavonoids, and which was up-regulated during the cold treatment and overexpressed in the CAR samples.
With respect to gibberellins (GA), we found two differentially expressed loci coding for GA 2-beta-dioxygenase (GA2OX), LOC18050091 and LOC18049951, which were up-regulated in samples C15 and M15 in relation to C0 and M0 (Fig. 6a). For brassinosteroids, the orthologs of CONSTITUTIVE PHOTOMORPHOGENESIS AND DWARFISM, CPD (LOC18044077), and of DWF4 (LOC18037059) were identified, which were down-regulated from 0 to 15 days, although their expression levels were higher in the CAR samples, especially for DWF4. On the other hand, LOC18041991, coding for the BRASSINOSTEROID INSENSITIVE 1-associated receptor kinase 1 protein (Brl1), was overexpressed at 15 days compared to 0 days, and in the CAR samples vs. the MAC ones (Fig. 6a). Finally, many genes involved in ABA synthesis and transport, as well as those coding for the receptors of the hormone, were differentially expressed in our samples (Fig. 6c). A clear expression increase was observed in the C15 samples for the genes coding for ABA receptors of the PYL family (PYL8/LOC18048127, PYL9/LOC18049819), as well as for ABA signaling integrators, such as the serine/threonine-protein kinase SRK2E.

Discussion

An RNA-Seq analysis was performed to investigate the genetic regulation of the response of citrus to exposure to low temperatures. Similar studies have been previously carried out in many species, including the citrus relative trifoliate orange (Poncirus trifoliata). However, our work takes novel approaches that can provide new insights into the process. The plants in our experiment were subjected to a long-term low temperature treatment, with samples taken on days 0, 15 and 30, while most works base their results on short low temperature exposures lasting up to 72 h. Our results thus offer new data on the long-term plant response to cold by showing gene expression variation over a 30-day period. We also measured the effects of low temperature on the commercial citrus variety sweet orange grafted onto two different rootstocks: Carrizo citrange, reported to be cold-tolerant, and Citrus macrophylla, one of the most chilling-sensitive citrus species. As we subjected the same scion to the same environmental conditions, any difference in the observed response could only be due to the rootstock onto which it was grafted. Therefore, our analysis also provides relevant data about how rootstocks transfer their characters to scions by, in this case, providing resistance or sensitivity to low temperature.

Physiological and transcriptomic data show relevant differences between sensitive and resistant rootstock samples

Our comprehensive comparative analysis of the physiological and transcriptomic changes that occurred in sweet orange grafted onto a cold-sensitive citrus rootstock (MAC) and a resistant one (CAR) revealed marked differences between them. Our data indicate a possible correlation between the changes in metabolite accumulation, hormone signaling and gene expression caused by the cold treatment, which might explain the resistant and sensitive phenotypes provided by rootstocks CAR and MAC, respectively. The functional analysis of the DEGs showed that all the previously described biological processes and pathways activated by the cold response also operate in citrus. The biological process with the lowest FDR score was response to cold, and other enriched terms and pathways included sugar metabolism, the CBF regulon, light response and hormonal regulatory pathways, all of which play essential roles in the regulation of plant cold tolerance.
Some enriched functions, such as response to chitin and response to karrikin, which are related to defense against pathogens and are apparently unrelated to cold stress, have recently been associated with responses to abiotic stresses, which evidences the usefulness of enrichment analyses for revealing new biological and molecular functions involved in a given biological process. The large number of DEGs found in this RNA-Seq study, as well as the many distinct biological processes in which they are involved, shows the complexity of the response to low temperatures, which has been reported by many previous works. Our results are similar to those obtained in an analogous study performed in trifoliate orange (Poncirus trifoliata); both works reveal a large number of cold-responsive genes, including those encoding TFs, hormone signaling elements, and enzymes associated with the synthesis of protective metabolites. The trifoliate orange transcriptome was analyzed after 72 h of cold treatment, whereas in our experiment samples were exposed to low temperature for 30 days. Thus, our results indicate that the defense mechanisms activated shortly after the onset of cold stress remain active after 1 month of cold exposure. Most of the key genes involved in the distinct cold response mechanisms were found to be differentially expressed in the 15-day samples. However, relevant differences were found between the two rootstocks. In Carrizo, 12691 DEGs were found in the C0-C15 comparison, but only 409 in the C15-C30 one, and none of the latter were relevant to cold adaptation. The M0-M15 comparison yielded 14199 DEGs, while that of M15-M30 gave 4391, 10-fold the number obtained for CAR, and included many of the key regulatory genes involved in the cold response, like SOC1, APP3 and 7, LHY, ZAT11, ABA1 and 2, ABCG40, NRT4, PP2C 24 and 75, TIFY 4B, or MYC2. None of these regulatory genes displayed expression changes in C15 vs. C30. These data suggest that at 15 days the Carrizo plants were fully adapted to low temperature exposure, while the Macrophylla ones were still struggling to do so, with the expression of thousands of genes still adjusting to the adverse environment. Similar findings have been reported in Prunus persica, where major differences in the transcriptome were observed between chilling injury-sensitive and non-sensitive peach varieties, with quite a high degree of divergence between the cultivars' gene expression patterns. Distinct expression profiles in response to cold have also been found in potato when comparing the dynamic cold-response transcriptome of a sweetening-resistant genotype to that of a sensitive one. In short, the overview of the physiological and transcriptomic data revealed the crucial regulatory pathways involved in the response to low temperatures and abiotic stresses, which are discussed in the next sections. A summary of the relevant differentially expressed genes, as well as of the pathways involved in the response to low temperature, is shown in Fig. 7.

Fig. 7 Summary figure including the most important DEGs and pathways that changed during long-term cold stress in grafted citrus plants. DEGs in green represent genes whose expression increased, and DEGs in red represent genes whose expression decreased.

Accumulation of free sugars is higher in CAR and correlates with differences in gene expression

Considerable evidence demonstrates that sugars function as osmotic substances and membrane stabilizers, and can act as antioxidants in cold stress responses. The freezing tolerance level and the accumulation of soluble sugars like sucrose, glucose and fructose parallel one another under different photoperiods during cold acclimation. This scenario indicates that sugar accumulation is a fundamental component of enhanced freezing tolerance.
Our analyses showed a clear and continuous increase in the total soluble sugars in the resistant Carrizo samples (Fig. 1), which was not so evident in the sensitive Macrophylla ones. The raffinose concentration increased 336-fold in the CAR plants and 299-fold in the MAC plants, which was by far the largest increase. The raffinose family of oligosaccharides has been found to accumulate under different abiotic stresses, like drought or cold, and galactinol and raffinose levels correlate with increased tolerance to salinity, chilling and paraquat treatment. Long-term cold stress considerably increases the galactinol and raffinose contents in rice. Raffinose is found in the chloroplast and seems to play an important role by stabilizing photosystem II in A. thaliana under freezing conditions. Raffinose synthesis alone is not sufficient to induce cold tolerance or cold acclimation in A. thaliana, but plants are much better protected from oxidative damage when the raffinose concentration rises. The sharp increment in the raffinose concentration correlates with the significantly increased expression of the galactinol synthase and raffinose synthase genes, both involved in raffinose biosynthesis. Similar results, with an increase in the transcript levels of the genes related to galactinol and raffinose synthesis preceding the accumulation of raffinose, have been reported in rice and Arabidopsis. Therefore, the greater accumulation of free sugars, especially raffinose and fructose, in the resistant CAR plants would increase their adaptation to low temperatures, while the sensitive MAC ones would be more exposed to damage caused by cold. The accumulation of soluble sugars is usually accompanied by starch degradation in many plant species, including grasses, such as barley, and woody plants, like lychee or poplar. However, in our study the starch concentration did not decrease in the MAC plants, and even increased in the CAR samples, which correlates with the down-regulation of the citrus β-amylase gene (LOC18048130) observed in all the samples. These results contrast with those obtained in trifoliate orange which, under cold stress, showed considerably more PtrBAM1 activity and a greater accumulation of maltose and soluble sugars. Those data suggest that β-amylase is a member of the CBF regulon in P. trifoliata and plays an important role in cold tolerance by modulating the levels of soluble sugars. Nevertheless, several studies have also reported increased starch accumulation in plants under cold stress. In tomato (Solanum lycopersicum), cold activated starch accumulation in sensitive species: the starch content in leaves was 4- to 5-fold higher in cold-sensitive Solanum spp., but remained unchanged in three cold-adapted ones. Furthermore, sugar levels and photosynthesis did not change in the tolerant species, but either increased or decreased in the susceptible ones. Similarly, cold treatment in A. thaliana increased both the starch content and the accumulation of the major starch-degrading product, maltose. The reasons for these discrepancies remain unclear, and the response might depend on the exact experimental conditions, the haplotypes involved, etc.
Therefore, obtaining a more consistent view of the changes in starch metabolism during cold stress in citrus would require running experiments under similar conditions to allow direct comparisons between species and cold treatments.

Proline accumulation

Free proline accumulation as a plant response to abiotic stresses has been repeatedly reported, with a direct relation existing between proline accumulation and plant stress tolerance. Our results agree with this observation, as significant proline accumulation was found in our samples during the cold treatment. Furthermore, a differential 2-fold proline accumulation was observed in the resistant CAR plants vs. the sensitive MAC ones. Likewise, the genes involved in proline metabolism, such as P5CS, PDH1 and P5CDH, displayed a significantly increased expression in the CAR samples compared to the MAC ones. Our data corroborate the results obtained in a long-term cold stress experiment performed in Carrizo citrange, which reported increments in both the proline concentration and the expression levels of proline synthesis genes. An evaluation of proline levels during the low temperature acclimation of several Citrus species has shown that accumulated free proline was 3- to 6-fold higher in cold-acclimated than in non-acclimated tissues, and that higher proline significantly correlated with the plants' ability to survive freeze damage. More recently, studies into cold stress tolerance in chickpea and pine trees compared cold-tolerant and cold-sensitive cultivars to show an increase in proline in leaves exposed to cold stress in both species. Hence, we evidence a differential accumulation of total free sugars and proline in the CAR plants, which would contribute to protecting them from cold damage.

Gene regulation of the cold response

Our RNA-Seq study identified most of the key regulatory factors of the cold response in plants. As expected, the genes involved in the MAP kinase, light, CBF and hormone signaling pathways were differentially expressed from 0 to 15 days, which indicates that plants were adapting to low temperature exposure. These pathways have been extensively analyzed in many species, including trifoliate orange, and our results very much agree with those previous findings. We therefore focused the discussion on the differences between the CAR and MAC samples to gain some insight into the origin of cold sensitivity or hardiness in citrus species. The intraspecies comparisons C0/C15 and M0/M15 evidenced that most of the genes acting as promoters of cold adaptation were up-regulated, while those acting as repressors were down-regulated, in the 15-day samples vs. the 0-day ones. In turn, the interspecies expression analysis, M15/C15, showed that a relevant number of the genes acting as promoters displayed significantly higher expression levels in C15 in relation to M15 and that, on the contrary, the genes repressing cold adaptation had lower expression levels. The final expression balance would favor cold acclimation in the resistant Carrizo plants, while it would diminish in the sensitive Macrophylla ones. There are many pieces of evidence for the critical role of the CBF transcription factors in cold acclimation, so the up-regulation in all the 15-day samples of the three citrus CBF orthologs, DREB1B, DREB1D and DREB2A, was not unexpected. Remarkably, the FC and transcript levels were much higher in the C15 samples than in the M15 ones. The up-regulation of CBF genes has been reported to enhance cold hardiness in apple, barley, potato and poplar.
These results are consistent with ours, as the up-regulation of the CBF genes in Carrizo vs. Macrophylla would correlate with the hardiness or sensitivity displayed by each rootstock. Transcription factor ICE1 regulates the expression of the CBF genes under cold conditions, is expressed constitutively in Arabidopsis, and its overexpression in wild-type plants enhances the expression of the CBF regulon and improves freezing tolerance. The citrus ICE1 gene that we identified, LOC18051975, was identical to PtrbHLH from Poncirus trifoliata. The expression pattern of PtrbHLH under cold stress reached the highest level at 48 h, with a drop at 72 h, which cannot explain the severe expression reduction we observed in the 15-day samples vs. the controls. Further research with samples taken at shorter intervals is required to understand the dynamics of ICE1 expression during long-term cold stress. However, the regulation of ICE1 activity takes place at the protein level: ICE1 levels are modulated by ubiquitylation, which leads to its degradation, while sumoylation stabilizes ICE1 by preventing ubiquitylation. Thus, it was interesting to find that citrus SIZ1, coding for an E3 SUMO-protein ligase, was overexpressed in the 15-day and 30-day samples and, once again, its expression level was much higher in the C15 samples than in the M15 ones. Higher levels of SIZ1 transcripts would promote ICE1 protein stabilization, which would lead to the up-regulation of the CBF genes in Carrizo plants. Calmodulin-binding transcription activator (CAMTA) transcription factors are positive regulators of CBFs, of which CAMTA2 and CAMTA3 have been reported to directly bind the CBF2 promoter. The citrus genes LOC18034255 and LOC18031678, respectively coding for CAMTA2 and CAMTA3, were also more up-regulated in the 15-day samples, although no significant differences could be found for CAMTA3 between C15 and M15, and the expression level of CAMTA2 was even higher in M15. Cold-induced CBF expression can be affected by light quality, the circadian clock and the photoperiod. The late elongated hypocotyl (LHY) protein is known to directly bind CBF promoters and positively regulate CBF expression. In our experiment, the expression levels of the citrus LHY gene (LOC18047565) were always much higher in the CAR samples than in the MAC ones (4716.40 TMMs in C15 vs. 1445.42 TMMs in M15). Higher LHY levels in Carrizo would lead, as seen before, to a higher up-regulation of the CBF genes to improve cold resistance in this rootstock. In addition to their roles in plant photomorphogenesis, cross-talk between phytochrome-mediated light signals and cold signaling pathways has been identified in Arabidopsis and tomato. PHYA and PHYE play antagonistic roles: they positively and negatively regulate cold tolerance, respectively. The citrus loci PHYA/LOC18035337 and PHYE/LOC18053186 display expression patterns that agree with these roles; thus, PHYA was up-regulated and PHYE was down-regulated during the cold treatment. As described before, PHYA levels were significantly higher in the CAR samples, which would once again favor enhanced cold acclimation in Carrizo plants. On the contrary, the PHYE expression level was higher in the MAC samples, which would produce a stronger repression of cold acclimation and contribute to the cold-sensitive phenotype of Macrophylla plants. Mitogen-activated protein kinase cascades play a relevant role in plant cold adaptation and typically comprise three protein kinases, MEKK, MKK and MPK, which act serially.
The MKK4/5-MPK3/6 pathway promotes ICE1 degradation and the repression of the CBF genes. In our study, the citrus gene coding for MKK5, LOC112100736, was down-regulated, and its expression levels in the MAC samples were at least 2-fold those found in the CAR ones. Hence, in Macrophylla plants, a greater MKK5 activity would lower ICE1 levels and cause a more marked repression of the CBF genes, which would agree with Macrophylla cold sensitivity. The citrus HY5 gene, LOC18054790, showed a slightly increased expression in the C15 samples, but remained unchanged in M15; however, the amount of HY5 transcripts was 4-fold higher in the C15 samples than in M15, and almost 3-fold higher in C30 in relation to M30. The bZIP transcription factor HY5 is a hub of both the cold response and photomorphogenesis that positively regulates cold-induced gene expression to ensure the complete development of cold acclimation. HY5 mediates plant responses such as anthocyanin biosynthesis and the hormonal responses to ABA, GAs, cytokinin (CK) and auxins. The role of HY5 in cold acclimation is crucial: in A. thaliana, it controls the induction of nearly 10% of all cold-inducible genes. Our results suggest that the up-regulated HY5 expression would boost the cold adaptation process in Carrizo plants, while its down-regulation would hamper the response to low temperatures in Macrophylla plants. The cold response induces the synthesis of flavonoids and anthocyanins, which enhance cold acclimation. Anthocyanins are photoprotective agents that shade and protect the photosynthetic apparatus by absorbing excess visible and UV light and by scavenging radicals under different abiotic stresses. For example, anthocyanin-rich red pear fruit (cv. Anjou) and purple pepper leaves (cv. Huai Zi) show a more stable PSII photosynthetic capacity and a greater photo-oxidation tolerance compared to non-anthocyanin tissues. Our results indicate that the biosynthesis of anthocyanins and flavonoids would be significantly more active in Carrizo, which would be able to reduce ROS damage more efficiently than Macrophylla; less efficient ROS handling could be another possible cause of the Macrophylla cold-sensitive phenotype.

Hormonal regulation and rootstock/scion interactions

Plant hormones function as governors of signaling events in cold stress responses by displaying synergistic or antagonistic effects on the biosynthesis and signaling outputs of other hormones, creating a complex network of hormonal interactions. A transcriptomic study in Arabidopsis has shown that cold and light can induce the expression of the genes related to ABA biosynthesis to prepare plants for cold acclimation. In tomato, it has been proposed that the PHYA level increases at cold temperatures, which results in the up-regulation of HY5; HY5 then promotes ABA biosynthesis and GA catabolism, generating a low GA/ABA ratio capable of stopping plant growth and promoting cold tolerance. The growth-promoting hormone gibberellin (GA) is targeted by cold stress, which results in a reduction in bioactive GA that suppresses growth and delays flowering. The increased expression of the citrus genes GA2OX1/LOC18050091 and GA2OX2/LOC18049951 during our experiment would lower GA levels, causing plant growth arrest and promoting cold adaptation. However, no significant differences were found between the CAR and MAC samples. Brassinosteroids also promote plant growth but, unlike GA, seem to positively control cold stress responses.
Our results show a higher expression of Brl1, CPD and DWF4 in the C15 samples than in M15, which would significantly increase BR synthesis in Carrizo, promoting cold adaptation and contributing to its chilling-resistant phenotype. Jasmonic acid is an oxylipin whose levels increase under cold stress in different plant species, like rice and Arabidopsis, which correlates with a higher expression of the JA biosynthetic genes. The up-regulation of the genes involved in JA signaling, such as Cullin1, RING1, the SKP1 family and MYC2, together with the down-regulation of AFPH2/NINJA and TIFY 4B, repressors of JA signaling, suggests a relevant role for JA in cold adaptation in our experiment. However, the gene expression data did not correlate with the JA quantification in leaves and roots, as the amount of JA observed in the C15 and M15 samples decreased. Further research must be performed to find the reason for this uncoupling of gene expression and JA accumulation. Abscisic acid is a central regulator of cold stress signaling with emerging roles in the CBF-dependent pathway. In many plant species, higher ABA levels that correlate with increased ABA biosynthesis occur in response to cold, and ABA mutants display altered cold resistance. ABA acts as a long-distance transport signal that mediates root-to-shoot communication under stress conditions. In our study, many of the genes involved in ABA synthesis and transport, and those coding for the receptors of this hormone, were differentially expressed. The genes involved in ABA biosynthesis were down-regulated in M15 and C15 compared to M0 and C0, which indicates that ABA production in leaves decreased. Previous reports state that cold stress in Arabidopsis modifies ABA biosynthesis and catabolism, and also affects ABA transport and homeostasis. The down-regulation of the genes involved in ABA biosynthesis in the M15 samples correlated with the lower ABA concentration in the leaves of Macrophylla plants. However, the ABA concentration remained constant in the C15 samples despite the down-regulation of the ABA biosynthesis genes that also took place there. One possible reason for this result could be the rise in the ABA concentration observed in the roots of Carrizo plants, which was significantly higher after 15 and 30 days of cold than in the MAC plants, where no increase was observed (Fig. 2). The ABA quantification results agree with the transcript expression patterns: the genes coding for the NRT1/PTR and ABCG families of ABA transporters were up-regulated in the 15-day samples. Moreover, the expression of the genes coding for the PYL family of ABA receptors, and for ABA signaling integrators, like the SRK2E kinase or the PP2C family of protein phosphatases, clearly increased in the C15 samples, with significantly higher normalized expression values (TMMs) in C15 than in M15. These data suggest that the large amount of ABA produced in Carrizo roots could be transported to the leaves, where its concentration would remain constant, allowing the activation and promotion of the ABA-mediated signaling response to cold in CAR plants throughout the 30-day low temperature treatment. This would not happen in Macrophylla, where no increase in the ABA concentration took place; the lack of ABA signaling would hence lead to a cold-sensitive phenotype. These data suggest that the different sensitivity to low temperatures shown by the sweet orange scion could be transferred by the respective resistant or sensitive rootstock onto which it was grafted.
The influence of rootstocks on many aspects of scion biology is well established, and the molecular aspects of root-to-shoot and/or shoot-to-root signaling events are starting to be known, showing how grafting triggers differential responses between the scion and the rootstock. Grafting is widespread as a means to improve plant performance in terms of yield, quality and resilience to abiotic and biotic stresses. The use of rootstocks tolerant to different abiotic stresses, such as drought, salinity, or drastically rising or decreasing temperatures, is becoming indispensable in this era of global climate change. Highly tolerant rootstocks have been shown to improve cold tolerance in eggplant, tomato and cucumber. In citrus, a tetraploid Carrizo citrange rootstock enhances the natural chilling stress tolerance of the common clementine, C. clementina. This, together with the similar results we obtained with sweet orange, C. sinensis, employed as the scion, reveals that the ability of Carrizo citrange to promote cold tolerance does not depend on the species it is grafted onto. In another study, the impact of grafting on cold-responsive gene expression in Satsuma mandarin, Citrus unshiu, was addressed by real-time RT-PCR following exposure to cold shock or cold acclimation treatments; P. trifoliata, one of the parents of Carrizo citrange, was used as the rootstock. The Poncirus rootstock significantly affected gene expression in the C. unshiu scion, and major expression changes were detected during cold stress, which agrees with the results that we obtained for the whole transcriptome in sweet orange.

Conclusions

This work shows the results of studying the long-term effect of cold exposure on citrus plants. We provide metabolomic and transcriptomic data, which reveal how the response mechanisms activated by cold remain active after 30 days of low temperature exposure. By analyzing the transcriptomic changes in samples from a commercial citrus variety, sweet orange, grafted onto two rootstocks with different sensitivities to low temperatures, we demonstrate the extent to which rootstocks are able to modify the scion response to abiotic stress. The obtained results allowed us to propose a mechanism by which the resistant rootstock might promote cold hardiness in the scion via ABA signaling, which would up-regulate the regulatory pathways involved in cold adaptation. However, more research will be required to confirm the role of ABA signaling in the cold tolerance of citrus, as well as to describe in detail the scion-rootstock interactions that lead to the different responses to cold in the scion.

Methods

Soluble sugars and starch quantification

Soluble sugars and starch were measured biweekly in lyophilized and milled leaves (100 mg DW) and were analyzed by a colorimetric anthrone method. Samples were mixed with heated ethanol and centrifuged; the supernatant contained the soluble sugars, and the pellet contained the starch. For the sugar and starch determinations, an anthrone-acid solution was added and the samples were placed in a boiling water bath. Absorbance was read at 630 nm (Lambda 25, PerkinElmer, Shelton, CT, USA).
Primary metabolites

The primary metabolite analysis was performed at the Instituto de Biología Molecular y Celular de Plantas (UPV-CSIC, Valencia, Spain) on the Metabolomics Platform by a modification of the method described by Roessner et al.: 100 mg of leaves per sample were homogenized with liquid nitrogen and extracted in 1400 µL of 100% methanol supplemented with 60 µL of internal standard (0.2 mg ribitol in 1 mL of water) for 15 min at 70°C. The extract was centrifuged for 10 min at 14,000 rpm. The supernatant was transferred to a glass vial, and 750 µL of CHCl3 and 1500 µL of water were added. The mixture was vortexed for 15 s and centrifuged for 15 min at 14,000 rpm. Then, 150-µL aliquots of the methanol/water upper phase were vacuum-dried for 6-16 h. For derivatization, the dry residues were dissolved in 40 µL of 20 mg/mL methoxyamine hydrochloride in pyridine and incubated for 90 min at 37°C, followed by the addition of 70 µL of MSTFA (N-methyl-N-(trimethylsilyl)trifluoroacetamide) and 6 µL of a retention time standard mixture (3.7% mix of fatty acid methyl esters ranging from C8 to C24), with further incubation for 30 min at 37°C. Sample volumes (2 µL) were injected in the split and splitless modes to increase the metabolite detection range in a 6890N gas chromatograph (Agilent Technologies Inc., Santa Clara, CA, USA) coupled to a Pegasus 4D TOF mass spectrometer (LECO, St. Joseph, MI, USA). Gas chromatography was performed on a BPX35 (30 m × 0.32 mm × 0.25 µm) column (SGE Analytical Science Pty Ltd., Australia) with helium as the carrier gas at a constant flow of 2 mL/min. The liner was set at 230°C. The oven program was 85°C for 2 min, followed by an 8°C/min ramp to 360°C. Mass spectra were collected at 6.25 spectra s⁻¹ within the 35-900 m/z range at an ionization energy of 70 eV. Chromatograms and mass spectra were evaluated with the ChromaTOF program (LECO, St. Joseph, MI, USA).

Hormone quantification

The quantification of ABA and JA was carried out following the protocol of Seo, Jikumaru and Kamiya in lyophilized and milled leaves and roots collected at 0, 15 and 30 days of cold. In short, the material (about 100 mg of dry leaves or roots) was suspended in 80% methanol-1% acetic acid containing the internal standards and mixed by shaking for 1 h at 4°C. The extract was kept at -20°C overnight and then centrifuged, and the supernatant was dried in a vacuum evaporator. The dry residue was dissolved in 1% acetic acid and passed through an Oasis HLB column (reverse phase). For the ABA and JA quantifications, the dried eluate was dissolved in 5% acetonitrile-1% acetic acid, and the hormones were separated using an autosampler and reverse-phase UHPLC chromatography (2.6 µm Accucore RP-MS column, 50 mm length × 2.1 mm i.d.; Thermo Fisher Scientific) with a 5-50% acetonitrile gradient containing 0.05% acetic acid, at 400 µL/min over 14 min. The hormones were analyzed in a Q-Exactive mass spectrometer (Orbitrap detector; Thermo Fisher Scientific) by targeted Selected Ion Monitoring (SIM). The concentrations of the hormones in the extracts were determined using embedded calibration curves and the Xcalibur 4.0 and TraceFinder 4.1 SP1 programs. The internal standards for the ABA and JA quantifications were deuterium-labeled ABA (²H₆-ABA) and dhJA, respectively.

Statistical analyses

For the statistical analyses, all the resulting values were the mean of six independent plants per treatment.
The RT-PCR values were the mean and standard deviation of three biological replicates and three technical replicates per plant. Data were submitted to an analysis of variance (multifactor ANOVA) using Statgraphics Centurion, version 16.1 (Statistical Graphics, Englewood Cliffs, NJ, USA) after testing for normality and homogeneity of variance. When the ANOVA showed a statistical effect, means were separated by least significant differences (LSD) at P < 0.05.

RNA isolation

Leaf samples were obtained biweekly, collected in liquid N2 and stored at -80°C. Total RNA was isolated from 100 mg of plant tissue using the RNeasy Plant Mini Kit (Qiagen) with RLT buffer supplemented with β-mercaptoethanol (Sigma-Aldrich). Contaminant genomic DNA was removed with the RNase-Free DNase Set (Qiagen, CA, USA) by on-column digestion according to the manufacturer's instructions. Purified RNA (2 µg) was reverse-transcribed with SuperScript III Reverse Transcriptase (RT) (Life Technologies, Carlsbad, CA, USA) in a total volume of 10 µL. First-strand cDNA was diluted 50-fold, and 2 µL were used as the template for quantitative real-time RT-PCR in a final volume of 20 µL. Quantitative real-time PCR was performed in a StepOnePlus Real-Time PCR System (Life Technologies, Carlsbad, CA, USA) using TB Green Premix Ex Taq (Tli RNaseH Plus) (Takara Europe S.A.S., Saint-Germain-en-Laye, France). The PCR protocol consisted of 10 min at 95°C, followed by 40 cycles of 15 s at 95°C and 1 min at 60°C. The specificity of the reaction was assessed by the presence of a single peak on the dissociation curve and by estimating the size of the amplified product by agarose gel electrophoresis. All the specific primers (Table 3) were tested before the PCR reactions, and an efficiency of 2 was obtained. The CiclevActin and CiclevUBC4 transcripts, amplified with specific primers, were used as reference genes, and a single-factor ANOVA and linear regression analyses of the CT values were performed to examine the variation in the reference genes. The normalization factor was calculated as the geometric mean of the values of both reference genes. The relative expression was measured by the relative standard curve procedure with a 5-point dilution series. The results were the average of three independent biological replicates, with three technical replicates per biological sample.

RNA sequencing library preparation and sequencing

Total RNA from leaves was used for library construction. The poly(A)+ mRNA fraction was isolated from total RNA, and the cDNA libraries were obtained following Illumina's recommendations. Briefly, poly(A)+ RNA was isolated on poly-T oligo-attached magnetic beads and chemically fragmented prior to reverse transcription and cDNA generation. The cDNA fragments then underwent an end-repair process and the addition of a single 'A' base to the 3' end, followed by the ligation of adapters. Finally, the products were purified and enriched by PCR to create the indexed final double-stranded cDNA library. The quality of the libraries was analyzed on a Bioanalyzer 2100 with a High Sensitivity assay, and their quantity was determined by real-time PCR in a LightCycler 480 (Roche). The pool of libraries was sequenced by paired-end sequencing (2 × 100 bp) in an Illumina HiSeq 2500 sequencer.

RNA-Seq and differential expression analyses

RNA-Seq fastq files were pre-processed with Trimmomatic 0.38: reads with an average quality below 25 or shorter than 36 bases were filtered out, as implemented in the OmicsBox suite.
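As a rough illustration of those filtering criteria, the sketch below applies the same two thresholds (average Phred quality of 25 and minimum length of 36 bases, corresponding to Trimmomatic's AVGQUAL and MINLEN steps) to a plain FASTQ file; the file name is hypothetical and the input is assumed to be uncompressed:

```python
# Sketch of the read-filtering criteria applied before mapping: drop reads
# whose average Phred quality is below 25 or whose length is under 36 bases.

def passes_filters(seq, quals, min_avg_q=25, min_len=36):
    """Return True if a read meets the length and average-quality thresholds."""
    if len(seq) < min_len:
        return False
    avg_q = sum(ord(c) - 33 for c in quals) / len(quals)  # Phred+33 encoding
    return avg_q >= min_avg_q

def filter_fastq(path):
    """Yield (header, sequence, qualities) for the reads that pass."""
    with open(path) as fh:
        while True:
            header = fh.readline().rstrip()
            if not header:          # end of file
                break
            seq = fh.readline().rstrip()
            fh.readline()           # '+' separator line
            quals = fh.readline().rstrip()
            if passes_filters(seq, quals):
                yield header, seq, quals

for read in filter_fastq("sample_R1.fastq"):  # hypothetical file name
    pass  # retained reads would be written to the trimmed output here
```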
The C. clementina transcriptome was used as the reference for mapping. Transcript-level quantification was estimated from the RNA-Seq reads (FASTQ) using the RSEM software package, which allocates multi-mapping reads among transcripts by an expectation-maximization approach based on fast gapped-read alignment with Bowtie 2. The reference transcript sequences were obtained from the NCBI Citrus clementina Annotation Release 100 (GCF_000493195.1). Read counts were normalized with the trimmed mean of M values (TMM) method. Pairwise differential expression analyses were carried out with the edgeR package 3.28.0 using the GLM quasi-likelihood F-test, with 0.05 taken as the cut-off value. Gene set enrichment of GO terms for the differentially expressed genes was carried out using FatiGO, which employs Fisher's exact test for the statistical assessment of annotation differences between two sets of sequences. Expression heat maps were constructed with ClustVis. The original TMM values were ln(x)-transformed, and rows were clustered using correlation distance and average linkage.
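As a rough illustration of that last preprocessing step, the sketch below ln-transforms a hypothetical TMM matrix and clusters its rows with correlation distance and average linkage, the settings reported for ClustVis; the matrix values are made up, and the +1 offset is our own guard against zero counts:

```python
# Sketch of the heat-map preprocessing: ln-transform TMM values, then
# cluster gene rows with correlation distance and average linkage.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

tmm = np.array([
    [4716.40, 1445.42, 3900.10],  # hypothetical TMMs, genes x samples
    [120.50, 480.90, 90.30],
    [15.20, 14.80, 60.10],
])

log_expr = np.log(tmm + 1)  # ln(x)-transform; +1 added here to avoid log(0)

# Pairwise correlation distance between gene rows, then average linkage
dist = pdist(log_expr, metric="correlation")
tree = linkage(dist, method="average")

# Row order for the heat map follows the dendrogram leaf order
row_order = dendrogram(tree, no_plot=True)["leaves"]
print(row_order)
```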
package org.spoofax.jsglr2.parser; import java.util.HashMap; import java.util.Map; import java.util.stream.Collectors; import java.util.stream.Stream; import org.metaborg.parsetable.IParseTable; import org.spoofax.jsglr2.composite.CompositeParseForestManager; import org.spoofax.jsglr2.composite.CompositeReduceManager; import org.spoofax.jsglr2.datadependent.DataDependentParseForestManager; import org.spoofax.jsglr2.datadependent.DataDependentReduceManager; import org.spoofax.jsglr2.elkhound.BasicElkhoundStackManager; import org.spoofax.jsglr2.elkhound.ElkhoundParser; import org.spoofax.jsglr2.elkhound.ElkhoundReduceManager; import org.spoofax.jsglr2.elkhound.HybridElkhoundStackManager; import org.spoofax.jsglr2.incremental.IncrementalParseState; import org.spoofax.jsglr2.incremental.IncrementalParser; import org.spoofax.jsglr2.incremental.IncrementalReduceManager; import org.spoofax.jsglr2.incremental.diff.IStringDiff; import org.spoofax.jsglr2.incremental.diff.JGitHistogramDiff; import org.spoofax.jsglr2.incremental.parseforest.IncrementalParseForestManager; import org.spoofax.jsglr2.inputstack.IInputStack; import org.spoofax.jsglr2.inputstack.InputStack; import org.spoofax.jsglr2.inputstack.InputStackFactory; import org.spoofax.jsglr2.inputstack.LayoutSensitiveInputStack; import org.spoofax.jsglr2.inputstack.incremental.IIncrementalInputStack; import org.spoofax.jsglr2.inputstack.incremental.IncrementalInputStack; import org.spoofax.jsglr2.inputstack.incremental.IncrementalInputStackFactory; import org.spoofax.jsglr2.inputstack.incremental.LinkedIncrementalInputStack; import org.spoofax.jsglr2.layoutsensitive.LayoutSensitiveParseForestManager; import org.spoofax.jsglr2.layoutsensitive.LayoutSensitiveReduceManager; import org.spoofax.jsglr2.parseforest.*; import org.spoofax.jsglr2.parseforest.basic.BasicParseForestManager; import org.spoofax.jsglr2.parseforest.empty.NullParseForestManager; import org.spoofax.jsglr2.parseforest.hybrid.HybridParseForestManager; import org.spoofax.jsglr2.parser.failure.DefaultParseFailureHandler; import org.spoofax.jsglr2.recovery.*; import org.spoofax.jsglr2.recoveryincremental.RecoveryIncrementalParseState; import org.spoofax.jsglr2.reducing.*; import org.spoofax.jsglr2.stack.IStackNode; import org.spoofax.jsglr2.stack.StackRepresentation; import org.spoofax.jsglr2.stack.basic.BasicStackManager; import org.spoofax.jsglr2.stack.collections.*; import org.spoofax.jsglr2.stack.hybrid.HybridStackManager; public class ParserVariant { public final ActiveStacksRepresentation activeStacksRepresentation; public final ForActorStacksRepresentation forActorStacksRepresentation; public final ParseForestRepresentation parseForestRepresentation; public final ParseForestConstruction parseForestConstruction; public final StackRepresentation stackRepresentation; public final Reducing reducing; public final boolean recovery; public ParserVariant(ActiveStacksRepresentation activeStacksRepresentation, ForActorStacksRepresentation forActorStacksRepresentation, ParseForestRepresentation parseForestRepresentation, ParseForestConstruction parseForestConstruction, StackRepresentation stackRepresentation, Reducing reducing, boolean recovery) { this.activeStacksRepresentation = activeStacksRepresentation; this.forActorStacksRepresentation = forActorStacksRepresentation; this.parseForestRepresentation = parseForestRepresentation; this.parseForestConstruction = parseForestConstruction; this.stackRepresentation = stackRepresentation; this.reducing = reducing; this.recovery = recovery; } 
public boolean isValid() { return validate().count() == 0; } public Stream<String> validate() { // Implication N -> F is written as !N || F // Bi-implication N <-> F is written as N == F Map<String, Boolean> constraints = new HashMap<>(); constraints.put("both Reducing and StackRepresentation should use Elkhound", (reducing == Reducing.Elkhound) == (stackRepresentation == StackRepresentation.BasicElkhound || stackRepresentation == StackRepresentation.HybridElkhound)); constraints.put("null parse forest requires Full parse forest construction", parseForestRepresentation != ParseForestRepresentation.Null || parseForestConstruction == ParseForestConstruction.Full); constraints.put("both Reducing and ParseForestRepresentation should use Incremental", (parseForestRepresentation == ParseForestRepresentation.Incremental) == (reducing == Reducing.Incremental)); constraints.put("both Reducing and ParseForestRepresentation should use LayoutSensitive", (parseForestRepresentation == ParseForestRepresentation.LayoutSensitive) == (reducing == Reducing.LayoutSensitive)); constraints.put("both Reducing and ParseForestRepresentation should use DataDependent", (parseForestRepresentation == ParseForestRepresentation.DataDependent) == (reducing == Reducing.DataDependent)); constraints.put("both Reducing and ParseForestRepresentation should use Composite", (parseForestRepresentation == ParseForestRepresentation.Composite) == (reducing == Reducing.Composite)); constraints.put("recovery and layout-sensitive parsing not simultaneously supported", !(parseForestRepresentation == ParseForestRepresentation.LayoutSensitive && recovery)); constraints.put("recovery and composite parsing not simultaneously supported", !(parseForestRepresentation == ParseForestRepresentation.Composite && recovery)); constraints.put("optimized parse forest construction and layout-sensitive parsing not simultaneously supported", !(parseForestRepresentation == ParseForestRepresentation.LayoutSensitive && parseForestConstruction == ParseForestConstruction.Optimized)); constraints.put("optimized parse forest construction and composite parsing not simultaneously supported", !(parseForestRepresentation == ParseForestRepresentation.Composite && parseForestConstruction == ParseForestConstruction.Optimized)); return constraints.entrySet().stream().filter(constraint -> !constraint.getValue()).map(Map.Entry::getKey); } public String name() { //@formatter:off return (activeStacksRepresentation == ActiveStacksRepresentation.standard() ? "_" : "ActiveStacksRepresentation:" + activeStacksRepresentation) + "/" + (forActorStacksRepresentation == ForActorStacksRepresentation.standard() ? "_" : "ForActorStacksRepresentation:" + forActorStacksRepresentation) + "/" + (parseForestRepresentation == ParseForestRepresentation.standard() ? "_" : "ParseForestRepresentation:" + parseForestRepresentation) + "/" + (parseForestConstruction == ParseForestConstruction.standard() ? "_" : "ParseForestConstruction:" + parseForestConstruction) + "/" + (stackRepresentation == StackRepresentation.standard() ? "_" : "StackRepresentation:" + stackRepresentation) + "/" + (reducing == Reducing.standard() ? 
"_" : "Reducing:" + reducing) + "/Recovery:" + recovery; //@formatter:on } @Override public boolean equals(Object o) { if(this == o) return true; if(o == null || getClass() != o.getClass()) return false; ParserVariant that = (ParserVariant) o; return activeStacksRepresentation == that.activeStacksRepresentation && forActorStacksRepresentation == that.forActorStacksRepresentation && parseForestRepresentation == that.parseForestRepresentation && parseForestConstruction == that.parseForestConstruction && stackRepresentation == that.stackRepresentation && reducing == that.reducing && recovery == that.recovery; } private static //@formatter:off <ParseForest extends IParseForest, Derivation extends IDerivation<ParseForest>, ParseNode extends IParseNode<ParseForest, Derivation>, StackNode extends IStackNode, InputStack extends IInputStack, BacktrackChoicePoint extends IBacktrackChoicePoint<InputStack, StackNode>, ParseState extends AbstractParseState<InputStack, StackNode> & IRecoveryParseState<InputStack, StackNode, BacktrackChoicePoint>> //@formatter:on IParser<? extends IParseForest> withRecovery(Parser<ParseForest, Derivation, ParseNode, StackNode, InputStack, ParseState, ?, ?> parser) { parser.reduceManager.addFilter(new RecoveryReduceActionFilter<>()); parser.observing.attachObserver(new RecoveryObserver<>()); return parser; } private static //@formatter:off <ParseForest extends IParseForest, StackNode extends IStackNode, ParseState extends AbstractParseState<?, StackNode>> //@formatter:on IParser<? extends IParseForest> withoutRecovery(Parser<ParseForest, ?, ?, StackNode, ?, ParseState, ?, ?> parser) { parser.reduceManager.addFilter(ReduceActionFilter.ignoreRecoveryAndCompletion()); return parser; } public IParser<? extends IParseForest> getParser(IParseTable parseTable) { return getParser(parseTable, new ActiveStacksFactory(activeStacksRepresentation), new ForActorStacksFactory(forActorStacksRepresentation)); } private //@formatter:off <ParseForest extends IParseForest, Derivation extends IDerivation<ParseForest>, ParseNode extends IParseNode<ParseForest, Derivation>, StackNode extends IStackNode, InputStack extends IInputStack, ParseState extends AbstractParseState<InputStack, StackNode>> //@formatter:on ReducerFactory<ParseForest, Derivation, ParseNode, StackNode, InputStack, ParseState> reducerFactory() { if(parseForestConstruction == ParseForestConstruction.Optimized) return ReducerOptimized.factoryOptimized(); else return Reducer.factory(); } private //@formatter:off <ParseForest extends IParseForest, Derivation extends IDerivation<ParseForest>, ParseNode extends IParseNode<ParseForest, Derivation>, StackNode extends IStackNode, InputStack extends IInputStack, BacktrackChoicePoint extends IBacktrackChoicePoint<InputStack, StackNode>, ParseState extends AbstractParseState<InputStack, StackNode> & IRecoveryParseState<InputStack, StackNode, BacktrackChoicePoint>> //@formatter:on ReducerFactory<ParseForest, Derivation, ParseNode, StackNode, InputStack, ParseState> recoveryReducerFactory() { if(parseForestConstruction == ParseForestConstruction.Optimized) return RecoveryReducerOptimized.factoryRecoveryOptimized(); else return Reducer.factory(); } public IParser<? 
extends IParseForest> getParser(IParseTable parseTable, IActiveStacksFactory activeStacksFactory, IForActorStacksFactory forActorStacksFactory) { if(!this.isValid()) throw new IllegalStateException("Invalid parser variant: " + validate().collect(Collectors.joining(", "))); // TODO parametrize parser on diff algorithm to see what is the most performant IStringDiff diff = new JGitHistogramDiff(); InputStackFactory<IInputStack> inputStackFactory = InputStack::new; InputStackFactory<LayoutSensitiveInputStack> inputStackFactoryLS = LayoutSensitiveInputStack::new; IncrementalInputStackFactory<IIncrementalInputStack> incrementalInputStackFactory = IncrementalInputStack.factory(diff); // TODO switch between Eager, Lazy, and Linked? // IncrementalInputStackFactory<IIncrementalInputStack> incrementalInputStackFactory = // EagerPreprocessingIncrementalInputStack.factory(diff, new ProcessUpdates(incrementalParseForestManager)); IncrementalInputStackFactory<IIncrementalInputStack> incrementalRecoveryInputStackFactory = LinkedIncrementalInputStack.factory(diff); // @formatter:off switch(this.parseForestRepresentation) { default: case Basic: switch(this.reducing) { case Elkhound: switch(this.stackRepresentation) { case BasicElkhound: if (this.recovery) return withRecovery(new ElkhoundParser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicElkhoundStackManager.factory(), BasicParseForestManager.factory(), new RecoveryDisambiguator<>(), ElkhoundReduceManager.factoryElkhound(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new ElkhoundParser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicElkhoundStackManager.factory(), BasicParseForestManager.factory(), null, ElkhoundReduceManager.factoryElkhound(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case HybridElkhound: if (this.recovery) return withRecovery(new ElkhoundParser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridElkhoundStackManager.factory(), BasicParseForestManager.factory(), new RecoveryDisambiguator<>(), ElkhoundReduceManager.factoryElkhound(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new ElkhoundParser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridElkhoundStackManager.factory(), BasicParseForestManager.factory(), null, ElkhoundReduceManager.factoryElkhound(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException("Elkhound reducing requires Elkhound stack"); } case Basic: switch(this.stackRepresentation) { case Basic: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), BasicParseForestManager.factory(), new RecoveryDisambiguator<>(), ReduceManager.factory(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), BasicParseForestManager.factory(), null, ReduceManager.factory(reducerFactory()), 
DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), BasicParseForestManager.factory(), new RecoveryDisambiguator<>(), ReduceManager.factory(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), BasicParseForestManager.factory(), null, ReduceManager.factory(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException("Basic reducing requires Basic or Hybrid stack"); } default: throw new IllegalStateException("Only Elkhound or basic reducing possible with basic parse forest representation"); } case Null: switch(this.reducing) { case Elkhound: switch(this.stackRepresentation) { case BasicElkhound: if (this.recovery) return withRecovery(new ElkhoundParser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicElkhoundStackManager.factory(), NullParseForestManager.factory(), new RecoveryDisambiguator<>(), ElkhoundReduceManager.factoryElkhound(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new ElkhoundParser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicElkhoundStackManager.factory(), NullParseForestManager.factory(), null, ElkhoundReduceManager.factoryElkhound(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case HybridElkhound: if (this.recovery) return withRecovery(new ElkhoundParser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridElkhoundStackManager.factory(), NullParseForestManager.factory(), new RecoveryDisambiguator<>(), ElkhoundReduceManager.factoryElkhound(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new ElkhoundParser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridElkhoundStackManager.factory(), NullParseForestManager.factory(), null, ElkhoundReduceManager.factoryElkhound(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException("Elkhound reducing requires Elkhound stack"); } case Basic: switch(this.stackRepresentation) { case Basic: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), NullParseForestManager.factory(), new RecoveryDisambiguator<>(), ReduceManager.factory(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), NullParseForestManager.factory(), null, ReduceManager.factory(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, 
RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), NullParseForestManager.factory(), new RecoveryDisambiguator<>(), ReduceManager.factory(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), NullParseForestManager.factory(), null, ReduceManager.factory(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException("Basic reducing requires Basic or Hybrid stack"); } default: throw new IllegalStateException("Only Elkhound or basic reducing possible with empty parse forest representation"); } case Hybrid: switch(this.reducing) { case Elkhound: switch(this.stackRepresentation) { case BasicElkhound: if (this.recovery) return withRecovery(new ElkhoundParser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicElkhoundStackManager.factory(), HybridParseForestManager.factory(), new RecoveryDisambiguator<>(), ElkhoundReduceManager.factoryElkhound(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new ElkhoundParser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicElkhoundStackManager.factory(), HybridParseForestManager.factory(), null, ElkhoundReduceManager.factoryElkhound(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case HybridElkhound: if (this.recovery) return withRecovery(new ElkhoundParser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridElkhoundStackManager.factory(), HybridParseForestManager.factory(), new RecoveryDisambiguator<>(), ElkhoundReduceManager.factoryElkhound(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new ElkhoundParser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridElkhoundStackManager.factory(), HybridParseForestManager.factory(), null, ElkhoundReduceManager.factoryElkhound(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException("Elkhound reducing requires Elkhound stack"); } case Basic: switch(this.stackRepresentation) { case Basic: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), HybridParseForestManager.factory(), new RecoveryDisambiguator<>(), ReduceManager.factory(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), HybridParseForestManager.factory(), null, ReduceManager.factory(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), HybridParseForestManager.factory(), 
new RecoveryDisambiguator<>(), ReduceManager.factory(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), HybridParseForestManager.factory(), null, ReduceManager.factory(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException("Basic reducing requires Basic or Hybrid stack"); } default: throw new IllegalStateException("Only Elkhound or basic reducing possible with hybrid parse forest representation"); } case DataDependent: if(this.reducing != Reducing.DataDependent) throw new IllegalStateException(); switch(this.stackRepresentation) { case Basic: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), DataDependentParseForestManager.factory(), new RecoveryDisambiguator<>(), DataDependentReduceManager.factoryDataDependent(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), DataDependentParseForestManager.factory(), null, DataDependentReduceManager.factoryDataDependent(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: if (this.recovery) return withRecovery(new Parser<>(inputStackFactory, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), DataDependentParseForestManager.factory(), new RecoveryDisambiguator<>(), DataDependentReduceManager.factoryDataDependent(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new Parser<>(inputStackFactory, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), DataDependentParseForestManager.factory(), null, DataDependentReduceManager.factoryDataDependent(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException(); } case LayoutSensitive: if(this.reducing != Reducing.LayoutSensitive || this.recovery) throw new IllegalStateException(); switch(this.stackRepresentation) { case Basic: //if (this.recovery) // return withRecovery(new Parser<>(inputStackFactoryLS, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), LayoutSensitiveParseForestManager.factory(), new RecoveryDisambiguator<>(), LayoutSensitiveReduceManager.factoryLayoutSensitive(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); //else return withoutRecovery(new Parser<>(inputStackFactoryLS, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), LayoutSensitiveParseForestManager.factory(), null, LayoutSensitiveReduceManager.factoryLayoutSensitive(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: //if (this.recovery) // return withRecovery(new Parser<>(inputStackFactoryLS, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, 
HybridStackManager.factory(), LayoutSensitiveParseForestManager.factory(), new RecoveryDisambiguator<>(), LayoutSensitiveReduceManager.factoryLayoutSensitive(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); //else return withoutRecovery(new Parser<>(inputStackFactoryLS, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), LayoutSensitiveParseForestManager.factory(), null, LayoutSensitiveReduceManager.factoryLayoutSensitive(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException(); } case Composite: if(this.reducing != Reducing.Composite || this.recovery) throw new IllegalStateException(); switch(this.stackRepresentation) { case Basic: //if (this.recovery) // return withRecovery(new Parser<>(inputStackFactoryLS, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), LayoutSensitiveParseForestManager.factory(), new RecoveryDisambiguator<>(), LayoutSensitiveReduceManager.factoryLayoutSensitive(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); //else return withoutRecovery(new Parser<>(inputStackFactoryLS, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), CompositeParseForestManager.factory(), null, CompositeReduceManager.factoryComposite(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: //if (this.recovery) // return withRecovery(new Parser<>(inputStackFactoryLS, RecoveryParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), LayoutSensitiveParseForestManager.factory(), new RecoveryDisambiguator<>(), LayoutSensitiveReduceManager.factoryLayoutSensitive(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); //else return withoutRecovery(new Parser<>(inputStackFactoryLS, ParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), CompositeParseForestManager.factory(), null, CompositeReduceManager.factoryComposite(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException(); } case Incremental: if(this.reducing != Reducing.Incremental) throw new IllegalStateException(); switch(this.stackRepresentation) { case Basic: if (this.recovery) return withRecovery(new IncrementalParser<>(incrementalRecoveryInputStackFactory, RecoveryIncrementalParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), IncrementalParseForestManager.factory(), new RecoveryDisambiguator<>(), IncrementalReduceManager.factoryIncremental(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new IncrementalParser<>( incrementalInputStackFactory, IncrementalParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, BasicStackManager.factory(), IncrementalParseForestManager.factory(), null, IncrementalReduceManager.factoryIncremental(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); case Hybrid: if (this.recovery) return withRecovery(new IncrementalParser<>(incrementalRecoveryInputStackFactory, RecoveryIncrementalParseState.factory(activeStacksFactory,
forActorStacksFactory), parseTable, HybridStackManager.factory(), IncrementalParseForestManager.factory(), new RecoveryDisambiguator<>(), IncrementalReduceManager.factoryIncremental(recoveryReducerFactory()), RecoveryParseFailureHandler.factory(), RecoveryParseReporter.factory())); else return withoutRecovery(new IncrementalParser<>( incrementalInputStackFactory, IncrementalParseState.factory(activeStacksFactory, forActorStacksFactory), parseTable, HybridStackManager.factory(), IncrementalParseForestManager.factory(), null, IncrementalReduceManager.factoryIncremental(reducerFactory()), DefaultParseFailureHandler.factory(), EmptyParseReporter.factory())); default: throw new IllegalStateException(); } } // @formatter:on } }
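For context, a minimal usage sketch of the API above. The ParserVariant constructor is not part of this excerpt, so obtaining a variant instance is assumed, and createParser is an illustrative helper name rather than part of the real API:

// Illustrative helper (assumption: a ParserVariant and an IParseTable are
// obtained elsewhere; the ParserVariant constructor is not shown above).
static IParser<? extends IParseForest> createParser(ParserVariant variant, IParseTable parseTable) {
    if (!variant.isValid()) {
        // validate() yields one message per violated constraint
        throw new IllegalArgumentException(
            "Invalid parser variant: " + variant.validate().collect(Collectors.joining(", ")));
    }
    return variant.getParser(parseTable);
}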
FVB.129P2Pde6b+ Tyrcch/Ant, a sighted variant of the FVB/N mouse strain suitable for behavioral analysis

Mice of the FVB/N strain are severely visually impaired as a result of defects in the tyrosinase gene, which cause a deficiency of the key enzyme for melanin synthesis in skin and eye and thus albinism (Tyrc/c), and defects in the cyclic guanosine monophosphate phosphodiesterase gene, which cause retinal degeneration (Pde6brd1/rd1). Nevertheless, FVB/N mice are commonly used for the generation of transgenic animals because of their large, strong pronuclei and high breeding performance. However, due to the visual impairment of FVB/N animals, the resulting transgenic animals cannot be used in tests that depend on vision, including tests of cognitive behavior. Therefore, we have bred a sighted version of the FVB/N strain by an outcross between FVB/N and 129P2/OlaHsd, followed by repeated backcrosses to FVB/N mice while selecting against albinism and against homozygosity of the retinal degeneration mutation. After 11 generations of backcrossing, sighted animals were intercrossed to generate the congenic FVB.129P2Pde6b+ Tyrcch/Ant strain, which is pigmented (Tyrcch/cch) and devoid of the genetic predisposition to retinal degeneration. The intact visual abilities of the FVB.129P2Pde6b+ Tyrcch/Ant mice, for which we propose the name FVBS/Ant, were demonstrated by a clear visual evoked potential, normal eye histology, and improved performance in the Morris water maze test.
3D computer vision and applications

To overcome the shortcomings of current stereo vision, a reliable stereo method is proposed that exploits the reliability of image features when determining correspondence. A rich description of a scene is obtained by integrating range data and color data. Interpreting a scene requires knowledge or models of the objects in it; based on this knowledge, the most effective process is triggered at each stage, depending on what information has been obtained at that stage. In this paradigm, a monocular image of a 3D scene can also be interpreted. The author describes applications of 3D computer vision and his approach to a more flexible 3D computer vision.
# Nets/train_nodule.py
import sys
sys.path.append('../Nets/')

from glob import glob
from os.path import join
from multiprocessing import Pool
from functools import partial

from scipy.ndimage.interpolation import rotate
from keras.callbacks import ModelCheckpoint
from tqdm import tqdm

from Nodule import *
from numpy import *

PATH = {
    'DATA': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/TRAIN',
    'DATA_OUT': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/TRAIN_OUT',
    'CSV': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/CSV',
    'LABELS': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/CSV/stage1_labels.csv',
    'LUNA': '/fasthome/a.dobrenkii/LUNA',
    'LUNA_DATA': '/fasthome/a.dobrenkii/LUNA/DATA',
    'LUNA_SOBEL': '/fasthome/a.dobrenkii/LUNA/SOBEL_IMG',
    'LUNA_LUNGS': '/fasthome/a.dobrenkii/LUNA/LUNGS_IMG',
    'LUNA_MASKS': '/fasthome/a.dobrenkii/LUNA/MASKS',
    'LUNA_CSV': '/fasthome/a.dobrenkii/LUNA/CSVFILES',
    'LUNA_PRED': '/fasthome/a.dobrenkii/LUNA/PRED',
    'PATCH_PATHS': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/LUNA/OUT/PATCHES',
    'LUNA_NODULES': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/LUNA/OUT/PATCHES/NODULES',
    'LUNA_VESSELS': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/LUNA/OUT/PATCHES/VESSELS',
    'WEIGHTS': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data/WEIGHTS',
    'CHECKPOINTS': '/home/a.dobrenkii/Projects/Kaggle/DataScienceBowl2K17/data'
}

CPU = 10
ANGLE = 35
SHIFT = 4
SHAPE = (8, 64, 64)
TRAIN_TEST = .2
NB_EPOCH = 50

model, bottle_neck = dim_concentration()
model.compile('adam', 'mse')


def augment_patch(patch, shape, angle=15, shift=4):
    """Randomly rotate and shift a 3D patch, crop it to `shape`,
    and min-max normalize it to [0, 1]."""
    if angle:
        shift = random.randint(-shift, shift, 3)
        patch = rotate(patch, random.uniform(-angle, angle), axes=[1, 2])
        patch = rotate(patch, random.uniform(-angle, angle), axes=[0, 1])
        patch = rotate(patch, random.uniform(-angle, angle), axes=[0, 2])

    # Crop a `shape`-sized block around the (possibly shifted) center
    center = (array(patch.shape) // 2) + shift
    left = array(shape) // 2
    right = array(shape) - left
    patch = patch[center[0] - left[0]:center[0] + right[0],
                  center[1] - left[1]:center[1] + right[1],
                  center[2] - left[2]:center[2] + right[2]]

    mn = patch.min()
    mx = patch.max()
    if (mx - mn) != 0:
        patch = (patch - mn) / (mx - mn)
    else:
        patch[:, :, :] = 0.
    return patch


def batch_generator(patch_paths, batch_size, shape=(8, 64, 64), angle=15, shift=4, CPU=4):
    """Endlessly yield (input, target) batches for autoencoder training;
    input and target are the same augmented patches."""
    number_of_batches = ceil(len(patch_paths) / batch_size)
    counter = 0
    random.shuffle(patch_paths)
    while True:
        batch_files = patch_paths[batch_size * counter:batch_size * (counter + 1)]
        patch_list = [load(patch_path) for patch_path in batch_files]

        augment = partial(augment_patch, shape=shape, angle=angle, shift=shift)
        with Pool(CPU) as pool:
            patch_list = pool.map(augment, patch_list)

        counter += 1
        yield expand_dims(array(patch_list), 1), expand_dims(array(patch_list), 1)

        if counter == number_of_batches:
            random.shuffle(patch_paths)
            counter = 0


# patch_paths = glob(join(PATH['LUNA_NODULES'], '*_patch.npy'))
# patch_paths += glob(join(PATH['LUNA_VESSELS'], '*_patch.npy'))
# shuffle(patch_paths)
# save(join(PATH['PATCH_PATHS'], 'LUNA'), array(patch_paths))

patch_paths = load(join(PATH['PATCH_PATHS'], 'LUNA.npy'))

# Hold out the first TRAIN_TEST fraction of the (pre-shuffled) paths for validation
train = patch_paths[int(len(patch_paths) * TRAIN_TEST):]
valid = patch_paths[:int(len(patch_paths) * TRAIN_TEST)]

SAMPLES_PER_EPOCH = len(train)
NB_VAL_SAMPLES = len(valid)

train_generator = batch_generator(train, batch_size=32, shape=SHAPE, angle=ANGLE, shift=SHIFT, CPU=CPU)
valid_generator = batch_generator(valid, batch_size=32, shape=SHAPE, angle=0, shift=0, CPU=CPU)

checkpoint = ModelCheckpoint(filepath=join(PATH['WEIGHTS'], '3DCAE_nodule_model'),
                             verbose=1,
                             save_best_only=True)

model.fit_generator(train_generator,
                    samples_per_epoch=SAMPLES_PER_EPOCH,
                    nb_epoch=NB_EPOCH,
                    callbacks=[checkpoint],
                    validation_data=valid_generator,
                    class_weight=None,
                    nb_val_samples=NB_VAL_SAMPLES,
                    nb_worker=1)
/**
 * Callback handler to build the list of JobIds.
 */
static int jobid_handler(void *ctx, int num_fields, char **row)
{
   RESTORE_CTX *rx = (RESTORE_CTX *)ctx;

   /* Skip the row if it repeats the JobId we saw last */
   if (bstrcmp(rx->last_jobid, row[0])) {
      return 0;
   }
   bstrncpy(rx->last_jobid, row[0], sizeof(rx->last_jobid));

   /* Append the JobId to the comma-separated list */
   if (rx->JobIds[0] != 0) {
      pm_strcat(rx->JobIds, ",");
   }
   pm_strcat(rx->JobIds, row[0]);
   return 0;
}
When Pine Bush plays a non-league game Thursday at Washingtonville, it will be Carillo vs. Carillo.

Pine Bush head coach Joe Carillo, who is in his 22nd season as a varsity head coach and won career game No. 250 last fall, will square off against his nephew, Jack Carillo, who's making his debut as a varsity head coach this season.

"It should be a lot of fun," Jack Carillo said. "We'll probably have a couple jokes back and forth during the game and before the game and after the game, but at the end of the day, we're still family. I still play with his kids, my cousins, all the time. It's always been a competitive family. We compete over the stupidest things."

Jack Carillo, a 2005 Ramapo High School grad, played college soccer at Nyack and SUNY Cortland. He coached at Nyack, Cortland and Wells College before taking Washingtonville's junior varsity job last year. He also coaches at Primo Sports in Florida and trains Joe Carillo's grandchildren, Gianni and Taylor Carillo.

"Whenever we get together, we'll spend an hour just talking soccer," Joe Carillo said. "He helps train my grandchildren as goalkeepers. We're a close Italian family."

"At the end of the day, it will probably be one of the few games that I think will be difficult just because of how badly he wants it and how badly I want it," said Jack Carillo.
/*
 * Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.hazelcast.cp.internal;

import com.hazelcast.cp.event.CPGroupAvailabilityEvent;

import java.util.EventListener;

/**
 * Unique registration key for each &lt;CPGroupAvailabilityEvent, CPGroupAvailabilityListener&gt; pair.
 * It's used to deduplicate multiple events with the same signature per listener.
 */
final class CPGroupAvailabilityEventKey {
    final CPGroupAvailabilityEvent event;
    final EventListener listener;

    CPGroupAvailabilityEventKey(CPGroupAvailabilityEvent event, EventListener listener) {
        this.event = event;
        this.listener = listener;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) {
            return true;
        }
        if (o == null || getClass() != o.getClass()) {
            return false;
        }
        CPGroupAvailabilityEventKey that = (CPGroupAvailabilityEventKey) o;
        return listener.equals(that.listener) && event.equals(that.event);
    }

    @Override
    public int hashCode() {
        int result = event.hashCode();
        result = 31 * result + listener.hashCode();
        return result;
    }
}
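To illustrate how such a key is meant to be used, here is a hypothetical dispatch helper; this is my sketch of the deduplication idea, not Hazelcast's actual dispatch code, and the choice of availabilityDecreased as the delivered callback is illustrative:

// Hypothetical sketch: deliver each event at most once per listener by
// remembering the (event, listener) pairs that were already dispatched.
final class DeduplicatingDispatcher {
    // Set.add returns false for duplicates, keyed by equals()/hashCode() above
    private final java.util.Set<CPGroupAvailabilityEventKey> delivered =
        java.util.concurrent.ConcurrentHashMap.newKeySet();

    void dispatch(CPGroupAvailabilityEvent event,
                  com.hazelcast.cp.event.CPGroupAvailabilityListener listener) {
        if (delivered.add(new CPGroupAvailabilityEventKey(event, listener))) {
            listener.availabilityDecreased(event);
        }
    }
}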
/** Configuration for the various thread pool settings used in nrtsearch. */
public class ThreadPoolConfiguration {

  private static final int DEFAULT_MAX_SEARCHING_THREADS =
      ((Runtime.getRuntime().availableProcessors() * 3) / 2) + 1;
  private static final int DEFAULT_MAX_SEARCH_BUFFERED_ITEMS =
      Math.max(1000, 2 * DEFAULT_MAX_SEARCHING_THREADS);

  private static final int DEFAULT_MAX_INDEXING_THREADS =
      Runtime.getRuntime().availableProcessors() + 1;
  private static final int DEFAULT_MAX_INDEXING_BUFFERED_ITEMS =
      Math.max(200, 2 * DEFAULT_MAX_INDEXING_THREADS);

  private static final int DEFAULT_MAX_GRPC_LUCENESERVER_THREADS = DEFAULT_MAX_INDEXING_THREADS;
  private static final int DEFAULT_MAX_GRPC_LUCENESERVER_BUFFERED_ITEMS =
      DEFAULT_MAX_INDEXING_BUFFERED_ITEMS;

  private static final int DEFAULT_MAX_GRPC_REPLICATIONSERVER_THREADS =
      DEFAULT_MAX_INDEXING_THREADS;
  private static final int DEFAULT_MAX_GRPC_REPLICATIONSERVER_BUFFERED_ITEMS =
      DEFAULT_MAX_INDEXING_BUFFERED_ITEMS;

  private final int maxSearchingThreads;
  private final int maxSearchBufferedItems;
  private final int maxIndexingThreads;
  private final int maxIndexingBufferedItems;
  private final int maxGrpcLuceneserverThreads;
  private final int maxGrpcLuceneserverBufferedItems;
  private final int maxGrpcReplicationserverThreads;
  private final int maxGrpcReplicationserverBufferedItems;

  public ThreadPoolConfiguration(YamlConfigReader configReader) {
    maxSearchingThreads =
        configReader.getInteger(
            "threadPoolConfiguration.maxSearchingThreads", DEFAULT_MAX_SEARCHING_THREADS);
    maxSearchBufferedItems =
        configReader.getInteger(
            "threadPoolConfiguration.maxSearchBufferedItems", DEFAULT_MAX_SEARCH_BUFFERED_ITEMS);
    maxIndexingThreads =
        configReader.getInteger(
            "threadPoolConfiguration.maxIndexingThreads", DEFAULT_MAX_INDEXING_THREADS);
    maxIndexingBufferedItems =
        configReader.getInteger(
            "threadPoolConfiguration.maxIndexingBufferedItems",
            DEFAULT_MAX_INDEXING_BUFFERED_ITEMS);
    maxGrpcLuceneserverThreads =
        configReader.getInteger(
            "threadPoolConfiguration.maxGrpcLuceneserverThreads",
            DEFAULT_MAX_GRPC_LUCENESERVER_THREADS);
    maxGrpcLuceneserverBufferedItems =
        configReader.getInteger(
            "threadPoolConfiguration.maxGrpcLuceneserverBufferedItems",
            DEFAULT_MAX_GRPC_LUCENESERVER_BUFFERED_ITEMS);
    maxGrpcReplicationserverThreads =
        configReader.getInteger(
            "threadPoolConfiguration.maxGrpcReplicationserverThreads",
            DEFAULT_MAX_GRPC_REPLICATIONSERVER_THREADS);
    maxGrpcReplicationserverBufferedItems =
        configReader.getInteger(
            "threadPoolConfiguration.maxGrpcReplicationserverBufferedItems",
            DEFAULT_MAX_GRPC_REPLICATIONSERVER_BUFFERED_ITEMS);
  }

  public int getMaxSearchingThreads() {
    return maxSearchingThreads;
  }

  public int getMaxSearchBufferedItems() {
    return maxSearchBufferedItems;
  }

  public int getMaxIndexingThreads() {
    return maxIndexingThreads;
  }

  public int getMaxIndexingBufferedItems() {
    return maxIndexingBufferedItems;
  }

  public int getMaxGrpcLuceneserverThreads() {
    return maxGrpcLuceneserverThreads;
  }

  public int getMaxGrpcReplicationserverThreads() {
    return maxGrpcReplicationserverThreads;
  }

  public int getMaxGrpcLuceneserverBufferedItems() {
    return maxGrpcLuceneserverBufferedItems;
  }

  public int getMaxGrpcReplicationserverBufferedItems() {
    return maxGrpcReplicationserverBufferedItems;
  }
}
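Each setting above is read from a dotted key such as threadPoolConfiguration.maxSearchingThreads. Assuming the reader resolves dotted paths against nested YAML (an assumption; YamlConfigReader is not shown in this excerpt), a configuration file overriding the defaults might look like the following sketch, with purely illustrative values:

threadPoolConfiguration:
  maxSearchingThreads: 16                     # default: ((cores * 3) / 2) + 1
  maxSearchBufferedItems: 1000                # default: max(1000, 2 * searching threads)
  maxIndexingThreads: 9                       # default: cores + 1
  maxIndexingBufferedItems: 200               # default: max(200, 2 * indexing threads)
  maxGrpcLuceneserverThreads: 9               # defaults to the indexing thread count
  maxGrpcLuceneserverBufferedItems: 200       # defaults to the indexing buffer size
  maxGrpcReplicationserverThreads: 9          # defaults to the indexing thread count
  maxGrpcReplicationserverBufferedItems: 200  # defaults to the indexing buffer size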
Golf

Origin and history

While the modern game of golf originated in 15th-century Scotland, the game's ancient origins are unclear and much debated. Some historians trace the sport back to the Roman game of paganica, in which participants used a bent stick to hit a stuffed leather ball. One theory asserts that paganica spread throughout Europe as the Romans conquered most of the continent during the first century BC, and eventually evolved into the modern game. Others cite chuiwan (捶丸; "chui" means striking and "wan" means small ball) as the progenitor, a Chinese game played between the eighth and fourteenth centuries. A Ming Dynasty scroll by the artist Youqiu, dating back to 1368 and entitled "The Autumn Banquet", shows a member of the Chinese Imperial court swinging what appears to be a golf club at a small ball with the aim of sinking it into a hole. The game is thought to have been introduced into Europe during the Middle Ages. Another early game that resembled modern golf was known as cambuca in England and chambot in France. The Persian game chaugán is another possible ancient origin. In addition, kolven (a game involving a ball and curved bats) was played annually in Loenen, Netherlands, beginning in 1297, to commemorate the capture of the assassin of Floris V a year earlier.

The modern game originated in Scotland, where the first written record of golf is James II's banning of the game in 1457, as an unwelcome distraction to learning archery. James IV lifted the ban in 1502 when he became a golfer himself, with golf clubs first recorded in 1503–1504: "For golf clubbes and balles to the King that he playit with". To many golfers, the Old Course at St Andrews, a links course dating to before 1574, is considered to be a site of pilgrimage. In 1764, the standard 18-hole golf course was created at St Andrews when members modified the course from 22 to 18 holes. Golf is documented as being played on Musselburgh Links, East Lothian, Scotland as early as 2 March 1672, which is certified as the oldest golf course in the world by Guinness World Records. The oldest surviving rules of golf were compiled in March 1744 for the Company of Gentlemen Golfers, later renamed The Honourable Company of Edinburgh Golfers, which played at Leith, Scotland. The world's oldest golf tournament in existence, and golf's first major, is The Open Championship, which was first played on 17 October 1860 at Prestwick Golf Club, in Ayrshire, Scotland, with Scottish golfers winning the earliest majors. Two Scotsmen from Dunfermline, John Reid and Robert Lockhart, first demonstrated golf in the U.S. by setting up a hole in an orchard in 1888, with Reid setting up America's first golf club the same year, Saint Andrew's Golf Club in Yonkers, New York.

Golf course

A golf course consists of either 9 or 18 holes, each with a teeing ground that is set off by two markers showing the bounds of the legal tee area, the fairway, the rough and other hazards, and the putting green surrounded by the fringe, with the pin (normally a flagstick) and cup. The levels of grass are varied to increase difficulty, or to allow for putting in the case of the green. While many holes are designed with a direct line-of-sight from the teeing area to the green, some holes may bend either to the left or to the right. This is commonly called a "dogleg", in reference to a dog's knee. The hole is called a "dogleg left" if the hole angles leftwards, and a "dogleg right" if it bends right.
Sometimes a hole's direction may bend twice; this is called a "double dogleg". A regular golf course consists of 18 holes, but nine-hole courses are common and can be played twice through for a full round of 18 holes.

Early Scottish golf courses were primarily laid out on links land, soil-covered sand dunes directly inland from beaches. This gave rise to the term "golf links", particularly applied to seaside courses and those built on naturally sandy soil inland. The first 18-hole golf course in the United States was on a sheep farm in Downers Grove, Illinois, in 1892. The course is still there today.

Play of the game

Every round of golf is based on playing a number of holes in a given order. A "round" typically consists of 18 holes that are played in the order determined by the course layout. Each hole is played once in the round on a standard course of 18 holes. The game can be played by any number of people, although a typical group will have one to four players per round. The typical pace of play is two hours for a 9-hole round and four hours for an 18-hole round.

Playing a hole on a golf course is initiated by putting a ball into play by striking it with a club on the teeing ground (also called the tee box, or simply the tee). For this first shot on each hole, it is allowed but not required for the golfer to place the ball on a tee prior to striking it. A tee is a small peg that can be used to elevate the ball slightly above the ground, up to a few centimetres high. Tees are commonly made of wood but may be constructed of any material, including plastic. Traditionally, golfers used mounds of sand to elevate the ball, and containers of sand were provided for the purpose. A few courses still require sand to be used instead of peg tees, to reduce litter and reduce damage to the teeing ground. Tees help reduce the interference of the ground or grass on the movement of the club, making the ball easier to hit, and also place the ball in the very centre of the striking face of the club (the "sweet spot") for better distance.

When the initial shot on a hole is intended to move the ball a long distance, typically more than 225 yards (210 m), the shot is commonly called a "drive" and is generally made with a long-shafted, large-headed wood club called a "driver". Shorter holes may be initiated with other clubs, such as higher-numbered woods or irons. Once the ball comes to rest, the golfer strikes it again as many times as necessary using shots that are variously known as a "lay-up", an "approach", a "pitch", or a "chip", until the ball reaches the green, where he or she then "putts" the ball into the hole (commonly called "sinking the putt" or "holing out"). The goal of getting the ball into the hole ("holing" the ball) in as few strokes as possible may be impeded by obstacles such as areas of longer grass called "rough" (usually found alongside fairways), which both slows any ball that contacts it and makes it harder to advance a ball that has stopped on it; "doglegs", which are changes in the direction of the fairway that often require shorter shots to play around them; bunkers (or sand traps); and water hazards such as ponds or streams.

In stroke play competitions played according to strict rules, each player plays his or her ball until it is holed, no matter how many strokes that may take.
In match play it is acceptable to simply pick up one's ball and "surrender the hole" after enough strokes have been made by a player that it is mathematically impossible for the player to win the hole. It is also acceptable in informal stroke play to surrender the hole after hitting three strokes more than the "par" rating of the hole (a "triple bogey" - see below); while technically a violation of Rule 3-2, this practice speeds play as a courtesy to others, and avoids "runaway scores", excessive frustration and injuries caused by overexertion.

The total distance from the first tee box to the 18th green can be quite long; total yardages "through the green" can be in excess of 7,000 yards (6.4 km), and when adding in the travel distance between the green of one hole and the tee of the next, even skilled players may easily travel five miles (8 km) or more during a round. At some courses, electric golf carts are used to travel between shots, which can speed up play and allow participation by individuals unable to walk a whole round. On other courses players generally walk the course, either carrying their bag using a shoulder strap or using a "golf trolley" for their bag. These trolleys may or may not be battery assisted. At many amateur tournaments, including U.S. high school and college play, players are required to walk and to carry their own bags, but at the professional and top amateur level, as well as at high-level private clubs, players may be accompanied by caddies, who carry and manage the players' equipment and who are allowed by the rules to give advice on the play of the course. A caddie's advice can only be given to the player or players for whom the caddie is working, and not to other competing players.

Penalties

Penalties are incurred in certain situations. They are counted towards a player's score as if there were extra swing(s) at the ball. Strokes are added for rule infractions or for hitting one's ball into an unplayable situation. A lost ball or a ball hit out of bounds results in a penalty of one stroke and distance (Rule 27–1). A one-stroke penalty is assessed if a player's equipment causes the ball to move or the removal of a loose impediment causes the ball to move (Rule 18–2). A one-stroke penalty is assessed if a player's ball comes to rest in a red- or yellow-staked hazard (Rule 26). If a golfer makes a stroke at the wrong ball (Rule 19–2) or hits a fellow golfer's ball with a putt (Rule 19–5), the player incurs a two-stroke penalty. Most rule infractions lead to stroke penalties but can also lead to disqualification. Disqualification could be from cheating, signing for a lower score, or from rule infractions that lead to improper play.

Equipment

Golf clubs are used to hit the golf ball. Each club is composed of a shaft with a grip on the top end and a club head on the bottom. Longer clubs, which have less loft, are meant to propel the ball a comparatively longer distance, and shorter clubs, with a higher degree of loft, a comparatively shorter distance. The actual physical length of each club is longer or shorter, depending on the distance the club is intended to propel the ball.

Golf clubs have traditionally been arranged into three basic types. Woods are large-headed, long-shafted clubs meant to propel the ball a long distance from relatively "open" lies, such as the tee box and fairway.
Of particular importance is the driver or "1-wood", which is the lowest lofted wood club, and in modern times has become highly specialized for making extremely long-distance tee shots, up to 300 yards (270 m) or more in a professional golfer's hands. Traditionally these clubs had heads made of a hardwood, hence the name, but virtually all modern woods are now made of metal such as titanium, or of composite materials. Irons are shorter-shafted clubs with a metal head primarily consisting of a flat, angled striking face. Traditionally the clubhead was forged from iron; modern iron clubheads are investment-cast from a steel alloy. Irons of varying loft are used for a variety of shots from virtually anywhere on the course, but most often for shorter-distance shots approaching the green, or to get the ball out of tricky lies such as sand traps. The third class is the putter, which evolved from the irons to create a low-lofted, balanced club designed to roll the ball along the green and into the hole. Putters are virtually always used on the green or in the surrounding rough/fringe. A fourth class, called hybrids, evolved as a cross between woods and irons, and are typically seen replacing the low-lofted irons with a club that provides similar distance, but a higher launch angle and a more forgiving nature.

A maximum of 14 clubs is allowed in a player's bag at one time during a stipulated round. The choice of clubs is at the golfer's discretion, although every club must be constructed in accordance with parameters outlined in the rules. (Clubs that meet these parameters are usually called "conforming".) Violation of these rules can result in disqualification. The exact shot hit at any given time on a golf course, and which club is used to accomplish the shot, are always completely at the discretion of the golfer; in other words, there is no restriction whatsoever on which club a golfer may or may not use at any time for any shot.

Golf balls are spherical, usually white (although other colours are allowed), and minutely pock-marked by dimples that decrease aerodynamic drag by increasing air turbulence around the ball in motion, which delays "boundary layer" separation and reduces the drag-inducing "wake" behind the ball, thereby allowing the ball to fly farther. The combination of a soft "boundary layer" and a hard "core" enables both distance and spin.

A tee is allowed only for the first stroke on each hole, unless the player must hit a provisional tee shot or replay his or her first shot from the tee.

Many golfers wear golf shoes with metal or plastic spikes designed to increase traction, thus allowing for longer and more accurate shots.

A golf bag is used to transport golf clubs and the player's other or personal equipment. Golf bags have several pockets designed for carrying equipment and supplies such as tees, balls, and gloves. Golf bags can be carried, pulled on a trolley or harnessed to a motorized golf cart during play. Golf bags usually have both a hand strap and shoulder strap for carrying; others may be carried over both shoulders like a backpack, and often bags have retractable legs that allow the bag to stand upright when at rest.

Stroke mechanics

The golf swing is outwardly similar to many other motions involving swinging a tool or playing implement, such as an axe or a baseball bat. However, unlike many of these motions, the result of the swing is highly dependent on several sub-motions being properly aligned and timed.
These ensure that the club travels up to the ball in line with the desired path; that the clubface is in line with the swing path; and that the ball hits the centre or "sweet spot" of the clubface. The ability to do this consistently, across a complete set of clubs with a wide range of shaft lengths and clubface areas, is a key skill for any golfer, and takes a significant effort to achieve.

Stance

Stance refers to how the golfer positions themselves in order to play a stroke; it is fundamentally important in being able to play a stroke effectively. The stance adopted is determined by what stroke is being played. All stances involve a slight crouch. This allows for a more efficient striking posture whilst also isometrically preloading the muscles of the legs and core; this allows the stroke to be played more dynamically and with a greater level of overall control. When adopting their stance, golfers start with the non-dominant side of the body facing the target (for a right-hander, the target is to their left). Setting the stance in regard to the position of the ball, and placing the clubhead behind the ball, is known as being at address; when in this position the player's body and the centerline of the club face are positioned parallel to the desired line of travel, with the feet either perpendicular to that line or slightly splayed outward. The feet are commonly shoulder-width apart for middle irons and putters, narrower for short irons and wider for long irons and woods. The ball is typically positioned more to the "front" of the player's stance (closer to the leading foot) for lower-lofted clubs, with the usual ball position for a drive being just behind the arch of the leading foot. The ball is placed further "back" in the player's stance (toward the trailing foot) as the loft of the club to be used increases. Most iron shots and putts are made with the ball roughly centered in the stance, while a few mid- and short-iron shots are made with the ball slightly behind the centre of the stance to ensure consistent contact between the ball and clubface, so the ball is on its way before the club continues down into the turf.

Musculature

A golf stroke uses the muscles of the core (especially the erector spinae muscles and latissimus dorsi muscle when turning), hamstrings, shoulders, and wrists. Stronger wrist muscles can prevent the wrists from being twisted during swings, whilst stronger shoulders increase the turning force. Weak wrists can also transmit the force to the elbows and even the neck and lead to injury. (When a muscle contracts, it pulls equally from both ends and, to have movement at only one end of the muscle, other muscles must come into play to stabilize the bone to which the other end of the muscle is attached.) Golf is a unilateral exercise that can upset the body's muscular balance, so complementary exercises are needed to maintain that balance.

Types of putting

Putting is considered to be the most important component of the game of golf. As the game of golf has evolved, many different putting techniques and grips have been devised to give golfers the best chance to make putts. When the game originated, golfers would putt with their dominant hand on the bottom of the grip and their weak hand on top of the grip. This grip and putting style is known as "conventional".
There are many variations of conventional, including overlap, where the golfer overlaps the off-hand index finger onto the dominant pinky; interlock, where the off-hand index finger interlocks with the dominant pinky and ring finger; double or triple overlap; and so on. Recently, "cross-handed" putting has become a popular trend amongst professional golfers and amateurs. Cross-handed putting places the dominant hand on top of the grip and the weak hand on the bottom. This grip restricts the motion of the dominant hand and eliminates the possibility of wrist breakdowns through the putting stroke.

Other notable putting styles include "the claw", a style that has the grip directly in between the thumb and index finger of the dominant hand while the palm faces the target, with the weak hand placed normally on the putter. Anchored putting is a style that requires a longer putter shaft that can be anchored into the player's stomach or below the chin; the idea is to stabilize one end of the putter, thus creating a more consistent pendulum stroke. This style will be banned in 2016 on the professional circuits.

Par

A hole is classified by its par, meaning the number of strokes a skilled golfer should require to complete play of the hole. The minimum par of any hole is 3, because par always includes a stroke for the tee shot and two putts. Pars of 4 and 5 strokes are ubiquitous on golf courses; more rarely, a few courses feature par-6 and even par-7 holes. Strokes other than the tee shot and putts are expected to be made from the fairway; for example, a skilled golfer expects to reach the green on a par-4 hole in two strokes (one from the tee, the "drive", and a second stroke to the green, the "approach") and then roll the ball into the hole in two putts for par. Getting the ball onto the green with two strokes remaining for putts is called making a "green in regulation" or GIR. Missing a GIR does not necessarily mean a golfer will not make par, but it does make doing so more difficult, as it reduces the number of putts available; conversely, making a GIR does not guarantee a par, as the player might require three or more putts to "hole out". Professional golfers typically make between 60% and 70% of greens in regulation.

The primary factor for classifying the par of a relatively straight, hazard-free hole is the distance from the tee to the green, and par distances can vary between tournament play and casual play. A typical casual-play par-3 hole is less than 250 yards (230 m) in length, a par-4 hole ranges between 251–450 yards (230–411 m), and a par-5 hole is longer than 450 yards (410 m). The rare par-6s can stretch well over 650 yards (590 m). These distances are based on the typical scratch golfer's drive distance of between 240 and 280 yards (220 and 260 m); a green further than the average player's drive will require additional shots from the fairway. However, other considerations must be taken into account; the key question is "how many strokes would a scratch golfer take to make the green by playing along the fairway?". The grade of the land from the tee to the hole might increase or decrease the carry and rolling distance of shots as measured linearly along the ground. Sharp turns or hazards may require golfers to "lay up" on the fairway in order to change direction or hit over the hazard with their next shot.
These design considerations will affect how even a scratch golfer would play the hole, irrespective of total distance from tee to green, and must be included in a determination of par. However, a par score never includes "expected" penalty strokes, as a scratch player is never "expected" to hit a ball into a water hazard or other unplayable situation; the placement of hazards only affects par when considering how a scratch golfer would avoid them.

Eighteen-hole courses typically total to an overall par score of 72 for a complete round; this is based on an average par of 4 for every hole, and so is often arrived at by designing a course with an equal number of par-5 and par-3 holes, the rest being par-4. Many combinations exist that total to par-72, and other course pars exist from 68 up to 76; such courses are no less worthy than courses of par-72.

Additionally, in some countries including the United States, courses are classified according to their play difficulty, which may be used to calculate a golfer's playing handicap for a given course. The two primary difficulty ratings in the U.S. are the Course Rating, which is effectively the expected score for a zero-handicap "scratch golfer" playing the course (and may differ from the course par), and the Slope Rating, which is a measure of how much worse a "bogey golfer" (with an 18 handicap) would be expected to play than a "scratch golfer". These two numbers are available for any USGA-sanctioned course, and are used in a weighted system to calculate handicaps (see below).

The overall par score in a tournament is the summation of all the par scores in each round. A typical four-round professional tournament played on a par-72 course has a tournament par of 288.

Basic forms of golf

There are two basic forms of golf play, match play and stroke play. Stroke play is more popular.

Match play

Two players (or two teams) play each hole as a separate contest against each other in what is called match play. The party with the lower score wins that hole, or if the scores of both players or teams are equal the hole is "halved" (or tied). The game is won by the party that wins more holes than the other. In the case that one team or player has taken a lead that cannot be overcome in the number of holes remaining to be played, the match is deemed to be won by the party in the lead, and the remainder of the holes are not played. For example, if one party already has a lead of six holes, and only five holes remain to be played on the course, the match is over and the winning party is deemed to have won "6 & 5". At any given point, if the lead is equal to the number of holes remaining, the party leading the match is said to be "dormie", and the match is continued until the party increases the lead by one hole or ties any of the remaining holes, thereby winning the match, or until the match ends in a tie with the lead player's opponent winning all remaining holes. When the game is tied after the predetermined number of holes have been played, it may be continued until one side takes a one-hole lead.

Stroke play

The score achieved for each and every hole of the round or tournament is added to produce the total score, and the player with the lowest score wins in stroke play. Stroke play is the game most commonly played by professional golfers. If there is a tie after the regulation number of holes in a professional tournament, a playoff takes place between all tied players.
Playoffs are either sudden death or employ a pre-determined number of holes, anywhere from three to a full 18. In sudden death, a player who scores lower on a hole than all of his opponents wins the match. If at least two players remain tied after such a playoff using a pre-determined number of holes, then play continues in sudden death format, where the first player to win a hole wins the tournament.

Other forms of play

The other forms of play in the game of golf are bogey competition, skins, 9-points, stableford, team play, and unofficial team variations.

Bogey competition

A bogey competition is a scoring format sometimes seen in informal tournaments. Its scoring is similar to match play, except each player compares their hole score to the hole's par rating instead of the score of another player. The player "wins" the hole if they score a birdie or better, they "lose" the hole if they score a bogey or worse, and they "halve" the hole by scoring par. By recording only this simple win-loss-halve score on the sheet, a player can shrug off a very poorly played hole with a simple "-" mark and move on. As used in competitions, the player or pair with the best win-loss "differential" wins the competition.

Skins

The Skins Game is a variation on match play where each hole has an amount of money (called "skin") attached to it. The lump sum may be prize money at the professional level (the most famous event to use these rules was the "LG Skins Game", played at Indian Wells Golf Resort in California until 2008), or an amount wagered for each hole among amateur players. The player with the lowest score on the hole wins the skin for that hole; if two or more players tie for the lowest score, the skin carries over to the next hole. The game continues until a player wins a hole outright, which may (and evidently often does) result in a player receiving money for a previous hole that they had not tied for. If players tie the 18th hole, either all players or only the tying players repeat the 18th hole until an outright winner is decided for that hole, and for all undecided skins.

9-Points

A nine-point game is another variant of match play typically played among threesomes, where each hole is worth a total of nine points. The player with the lowest score on a hole receives five points, the next-lowest score three points, and the highest score one point. Ties are generally resolved by summing the points contested and dividing them among the tying players; a two-way tie for first is worth four points to both players, a two-way tie for second is worth two points to both players, and a three-way tie is worth three points to each player. The player with the highest score after 18 holes (in which there are 162 points to be awarded) wins the game. This format can be used to wager on the game systematically; players each contribute the same amount of money to the pot, and a dollar value is assigned to each point scored (or each point after 18) based on the amount of money in the pot, with any overage going to the overall winner.

Stableford

The Stableford system is a simplification of stroke play that awards players points based on their score relative to the hole's par; the score for a hole is calculated by taking the par score, adding 2, then subtracting the player's hole score, making the result zero if negative. Alternately stated, a double bogey or worse is zero points, a bogey is worth one point, par is two, a birdie three, an eagle four, and so on.
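The standard Stableford computation described above fits in one expression; a minimal sketch (the method name is illustrative, not from any official scoring API):

// Standard Stableford: par + 2 - strokes, floored at zero.
static int stablefordPoints(int par, int strokes) {
    return Math.max(0, par + 2 - strokes);
}
// e.g. on a par 4: a birdie (3 strokes) scores 3 points, a par scores 2,
// a bogey scores 1, and a double bogey or worse scores 0.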
The advantages of this system over stroke play are a more natural "higher is better" scoring, the ability to compare Stableford scores between plays on courses with different total par scores (scoring "even" in stroke play will always give a Stableford score of 36), a reduced tendency to abandon the entire game after a particularly bad hole (a novice playing by strict rules may score as high as an 8 or 10 on a single difficult hole; their Stableford score for the hole would be zero, which puts them only two points behind par no matter how badly they played), and the ability to simply pick up one's ball once it is impossible to score any points for the hole, which speeds play. The USGA and R&A sanction a "Modified Stableford" system for scratch players, which makes par worth zero, a birdie worth 2, an eagle 5 and a double eagle 8, while a bogey is a penalty of −1 and a double bogey or worse −3. As with the original system, the highest score wins, and terrible scores on one or two holes won't wreck an entire game, but this system rewards "bogey-birdie" play more than the original, encouraging golfers to attempt the riskier birdie putt or eagle chip shot instead of simply parring each hole.

Handicap systems

A handicap is a numerical measure of an amateur golfer's ability to play golf over the course of 18 holes. A player's handicap generally represents the number of strokes above par that the player will make over the course of an above-average round of golf. The better the player, the lower their handicap. Someone with a handicap of 0 or less is often called a scratch golfer, and would typically score or beat the course par on a round of play (depending on course difficulty).

Calculating a handicap is often complicated, chiefly because golf courses are not uniformly challenging from course to course or between skill levels. A player scoring even par on Course A might average four over par on Course B, while a player averaging 20 over par on Course A might average only 16 over on Course B. So, to the "scratch golfer", Course B is more difficult, but to the "bogey golfer", Course A is more difficult. The reasons for this are inherent in the types of challenges the same course presents to both golfers. Distance is often a problem for amateur "bogey" golfers with slower swing speeds, who get less distance with each club and so typically require more shots to reach the green, raising their score compared to a scratch golfer with a stronger swing. However, courses are often designed with hazard placement to mitigate this advantage, forcing the scratch player to "lay up" to avoid bunkers or water, while the bogey golfer is more or less unaffected, as the hazard lies out of their range. Finally, terrain features and fairway maintenance can affect golfers of all skill levels; narrowing the fairway by adding obstacles or widening the rough on each side will typically increase the percentage of shots made from disadvantageous lies, increasing the challenge for all players.

By USGA rules, handicap calculation first requires calculating a "Handicap Differential" for each round of play the player has completed by strict rules.
That differential is a function of the player's "gross adjusted score" (adjustments can be made to mitigate various deviations either from strict rules or from a player's normal capabilities, for handicap purposes only) and two course-specific difficulty ratings: the Course Rating, a calculated expected score for a hypothetical "scratch golfer"; and the Slope Rating, a number based on how much worse a hypothetical 20-handicap "bogey golfer" would score compared to the "scratch golfer". The average Slope Rating of all USGA-rated courses as of 2012 is 113, which also factors into the Differential computation. The most recent Differentials are logged, up to 20 of them, and then the best of these (the number used depends on the number available) are selected, averaged, multiplied by 0.96 (an "excellence factor" that reduces the handicap of higher-scoring players, encouraging them to play better and thus lower their handicap), and truncated to the tenths place to produce the "Handicap Index". Additional calculations can be used to place higher significance on a player's recent tournament scores. A player's Handicap Index is then multiplied by the Slope Rating of the course to be played, divided by the average Slope Rating of 113, and rounded to the nearest integer to produce the player's Course Handicap.

Once calculated, the Course Handicap is applied in stroke play by simply reducing the player's gross score by the handicap, to produce a net score. So, a gross score of 96 with a handicap of 22 would produce a net score of 74. In match play, the lower handicap is subtracted from the higher handicap, and the resulting handicap strokes are awarded to the higher handicapper by distributing them among the holes according to each hole's difficulty; holes are ranked on the scorecard from 1 to 18 (or however many holes are available), and one stroke is applied to each hole from the most difficult to the least difficult. So, if one player has a 9 handicap and another has a 25 handicap, the 25-handicap player receives one handicap stroke on each of the 16 most difficult holes (25 - 9 = 16). If the 25-handicapper were playing against a "scratch golfer" (zero handicap), all 25 strokes would be distributed, first by applying one stroke to each hole, then applying the remaining strokes, one each, to the 7 most difficult holes; so, the handicap player would subtract 2 strokes from each of the 7 most difficult holes, and 1 each from the remaining 11.

Handicap systems have potential for abuse by players who may intentionally play badly to increase their handicap ("throwing their 'cap") before playing to their potential at an important event with a valuable prize. For this reason, professional golf associations do not use them, but they can be calculated and used along with other criteria to determine the relative strengths of various professional players. Touring professionals, being the best of the best, often have negative handicaps; they can be expected, on average, to score lower than the Course Rating on any course.
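The arithmetic above lends itself to a short sketch. The following is a simplified illustration, not the official USGA procedure: the best-N selection schedule is reduced to "best half" for brevity, and the differential formula shown is the commonly cited (gross adjusted score - Course Rating) x 113 / Slope, which the text above only alludes to.

import math

def handicap_differential(adjusted_gross, course_rating, slope):
    """Commonly cited USGA formula; 113 is the average Slope Rating."""
    return (adjusted_gross - course_rating) * 113 / slope

def handicap_index(differentials):
    """Average the best of the 20 most recent differentials, apply the
    0.96 "excellence factor", and truncate to tenths. "Best half" is a
    simplification; the real schedule varies with rounds logged."""
    recent = sorted(differentials[-20:])
    best = recent[:max(1, len(recent) // 2)]
    raw = sum(best) / len(best) * 0.96
    return math.floor(raw * 10) / 10  # truncate, don't round

def course_handicap(index, slope):
    """Course Handicap = Index * Slope / 113, rounded to nearest integer."""
    return round(index * slope / 113)

# Applying the handicap in stroke play, as in the text:
gross, hcp = 96, 22
assert gross - hcp == 74  # net score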
Popularity

In 2005 Golf Digest calculated that the countries with the most golf courses per capita were, in order: Scotland, New Zealand, Australia, Ireland, Canada, Wales, United States, Sweden, and England (countries with fewer than 500,000 people were excluded). The number of courses in other territories has also increased, one example being the expansion of golf in China. The first golf course in China opened in 1984, but by the end of 2009 there were roughly 600 in the country. For much of the 21st century, development of new golf courses in China has been officially banned (with the exception of the island province of Hainan), but the number of courses nonetheless tripled from 2004 to 2009; the "ban" has been evaded with the government's tacit approval simply by not mentioning golf in any development plans.

In the United States, the number of people who play golf twenty-five times or more per year decreased from 6.9 million in 2000 to 4.6 million in 2005, according to the National Golf Foundation. The NGF reported that the number who played golf at all decreased from 30 to 26 million over the same period.

In February 1971, astronaut Alan Shepard became the first person to golf anywhere other than Earth. He smuggled a golf club and two golf balls on board Apollo 14 with the intent to golf on the Moon. He attempted two drives. He shanked the first attempt, but it is estimated his second went more than 200 yards (180 m).

Professional golf

The majority of professional golfers work as club or teaching professionals ("pros"), and only compete in local competitions. A small elite of professional golfers are "tournament pros" who compete full-time on international "tours". Many club and teaching professionals working in the golf industry start as caddies or with a general interest in the game, finding employment at golf courses and eventually moving on to certifications in their chosen profession. These programs include independent institutions and universities, and some eventually lead to a Class A golf professional certification. Touring professionals typically start as amateur players, who attain their "pro" status after success in major tournaments that wins them prize money and/or notice from corporate sponsors. Jack Nicklaus, for example, gained widespread notice by finishing second in the 1960 U.S. Open to champion Arnold Palmer, with a 72-hole score of 282 (the best score to date in that tournament by an amateur). He played one more amateur year in 1961, winning that year's U.S. Amateur Championship, before turning pro in 1962.

Instruction

Golf instruction involves the teaching and learning of the game of golf. Proficiency in teaching golf instruction requires not only technical and physical ability but also knowledge of the rules and etiquette of the game. In some countries, golf instruction is best performed by teachers certified by the Professional Golfers Association. Some top instructors who work with professional golfers have become quite well known in their own right. Professional golf instructors can use physical conditioning, mental visualization, classroom sessions, club fitting, driving range instruction, on-course play under real conditions, and review of videotaped swings in slow motion to prepare the golfer for the course.

Golf tours

There are at least twenty professional golf tours, each run by a PGA or an independent tour organization, which is responsible for arranging events, finding sponsors, and regulating the tour. Typically a tour has "members" who are entitled to compete in most of its events, and also invites non-members to compete in some of them. Gaining membership of an elite tour is highly competitive, and most professional golfers never achieve it. Perhaps the most widely known tour is the PGA Tour, which tends to attract the strongest fields outside the four Majors and the four World Golf Championships events.
This is due largely to the fact that most PGA Tour events have a first prize of at least US$800,000. The European Tour, which attracts a substantial number of top golfers from outside North America, ranks second to the PGA Tour in worldwide prestige. Some top professionals from outside North America play enough tournaments to maintain membership on both the PGA Tour and European Tour. Since 2010, both tours' money titles have been claimed by the same individual three times, with Luke Donald doing so in 2011 and Rory McIlroy in 2012 and 2014. In 2013, Henrik Stenson won the FedEx Cup points race on the PGA Tour and the European Tour money title, but did not top the PGA Tour money list (that honour going to Tiger Woods).

The other leading men's tours include the Japan Golf Tour, the Asian Tour (Asia outside Japan), the PGA Tour of Australasia, and the Sunshine Tour (for southern Africa, primarily South Africa). The Japan, Australasian, Sunshine, PGA, and European Tours are the charter members of the trade body of the world's main tours, the International Federation of PGA Tours, founded in 1996. The Asian Tour became a full member in 1999. The Canadian Tour became an associate member of the Federation in 2000, and the Tour de las Américas (Latin America) became an associate member in 2007. The Federation underwent a major expansion in 2009 that saw eleven new tours become full members – the Canadian Tour, Tour de las Américas, China Golf Association, the Korea Professional Golfers' Association, the Professional Golf Tour of India, and the operators of all six major women's tours worldwide. The OneAsia Tour, founded in 2009, is not a member of the Federation, but was founded as a joint venture of the Australasia, China, Japan, and Korean tours. In 2011, the Tour de las Américas was effectively taken over by the PGA Tour, and in 2012 was folded into the new PGA Tour Latinoamérica. Also in 2012, the Canadian Tour was renamed PGA Tour Canada after it agreed to be taken over by the PGA Tour. All men's tours that are Federation members, except the India tour, offer points in the Official World Golf Ranking (OWGR) to players who place sufficiently high in their events. The OneAsia Tour also offers ranking points.

Golf is unique in having lucrative competition for older players. There are several senior tours for men aged fifty and over, arguably the best known of which is the U.S.-based PGA Tour Champions.

There are six principal tours for women, each based in a different country or continent. The most prestigious of these is the United States-based LPGA Tour. All of the principal tours offer points in the Women's World Golf Rankings for high finishers in their events.

All of the leading professional tours for under-50 players have an official developmental tour, in which the leading players at the end of the season earn a tour card on the main tour for the following season. Examples include the Web.com Tour, which feeds the PGA Tour, and the Challenge Tour, which is the developmental tour of the European Tour. The Web.com and Challenge Tours also offer OWGR points.

Men's major championships

The major championships are the four most prestigious men's tournaments of the year. In chronological order they are: The Masters, the U.S. Open, The Open Championship (referred to in North America as the British Open) and the PGA Championship. The fields for these events include the top several dozen golfers from all over the world.
The Masters has been played at Augusta National Golf Club in Augusta, Georgia, since its inception in 1934, and is the only major championship played at the same course each year. The U.S. Open and PGA Championship are played at courses around the United States, while the Open Championship is played at courses around the United Kingdom. Prior to the advent of the PGA Championship and The Masters, the four majors were the U.S. Open, the U.S. Amateur, the Open Championship, and the British Amateur.

Women's major championships

Women's golf does not have a globally agreed set of majors. The list of majors recognised by the dominant women's tour, the LPGA Tour in the U.S., has changed several times over the years, with the most recent changes occurring in 2001 and 2013. Like the PGA Tour, the (U.S.) LPGA Tour long had four majors, but now has five: the ANA Inspiration (previously known by several other names, most recently the Kraft Nabisco Championship), the Women's PGA Championship (previously known as the LPGA Championship), the U.S. Women's Open, the Women's British Open (which replaced the du Maurier Classic as a major in 2001) and The Evian Championship (added as the fifth major in 2013). Only the last two are also recognised as majors by the Ladies European Tour. However, the significance of this is limited, as the LPGA is far more dominant in women's golf than the PGA Tour is in mainstream men's golf. For example, the BBC has been known to use the U.S. definition of "women's majors" without qualification. Also, the Ladies' Golf Union, the governing body for women's golf in Great Britain and Ireland, stated on its official website that the Women's British Open was "the only Women's Major to be played outside the U.S." (this was before the elevation of The Evian Championship to major status). For many years, the Ladies European Tour tacitly acknowledged the dominance of the LPGA Tour by not scheduling any of its own events to conflict with the three LPGA majors played in the U.S., but that changed beginning in 2008, when the LET scheduled an event opposite the LPGA Championship. The second-richest women's tour, the LPGA of Japan Tour, does not recognise any of the U.S. LPGA or European majors, as it has its own set of majors (historically three, since 2008 four). However, these events attract little notice outside Japan.

Senior major championships

Senior (aged fifty and over) men's golf does not have a globally agreed set of majors. The list of senior majors on the U.S.-based PGA Tour Champions has changed over the years, but always by expansion. PGA Tour Champions now recognises five majors: the Senior PGA Championship, The Tradition, the Senior Players Championship, the United States Senior Open, and The Senior (British) Open Championship. Of the five events, the Senior PGA is by far the oldest, having been founded in 1937. The other events all date from the 1980s, when senior golf became a commercial success as the first golf stars of the television era, such as Arnold Palmer and Gary Player, reached the relevant age. The Senior Open Championship was not recognised as a major by PGA Tour Champions until 2003. The European Senior Tour recognises only the Senior PGA and the two Senior Opens as majors. However, PGA Tour Champions is arguably more dominant in global senior golf than the U.S. LPGA is in global women's golf.

Olympic Games

After a 112-year absence from the Olympic Games, golf returned for the 2016 Rio Games, where 120 athletes represented 41 countries.
Women

The first recorded woman golfer played the game in 1552: Mary, Queen of Scots, who commissioned St Andrews Links. However, it was not until the 20th century that women golfers were taken seriously and eventually broke the "Gentlemen Only, Ladies Forbidden" attitude; many men saw women as unfit to play the sport for supposed lack of strength and ability. In the United States, 1891 was a pivotal year for women's golf: the Shinnecock Hills nine-hole course was built in Southampton, New York, for women, and the club was the first to offer membership to women golfers. Four years later, in 1895, the U.S. Golf Association held the first Women's Amateur Championship.

Like professional golfer Bobby Jones, Joyce Wethered was considered a star of the 1920s. Jones praised Wethered in 1930 after they had played an exhibition against each other, doubting that there had ever been a better golfer, man or woman. Such praise was not enough to change prevailing views of women golfers, however. The Royal Liverpool club refused entry to Sir Henry Cotton's wife in the late 1940s, the club secretary releasing a statement saying, "No woman ever has entered the clubhouse and, praise God, no woman ever will." The American golfer and all-around athlete Babe Zaharias did not need to enter the clubhouse; she proved herself on the course, becoming the first American to win the British Women's Amateur title in 1947. The following year she became the first woman to attempt to qualify for the U.S. Open, but her application was rejected by the USGA, which stated that the event was intended to be open to men only.

The Ladies Professional Golf Association was formed in 1950 to popularize the sport and provide competitive opportunities for women golfers, though the competitions were not the same as the men's. In 1972, the U.S. Congress passed Title IX of the Education Amendments: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subject to discrimination under any education program or activities receiving Federal financial assistance." The American Renee Powell moved to the UK in the 1970s to further her career, and in 1977 became the first woman to play in a British men's tournament.

Today, women golfers are still working to gain the same opportunities as men golfers, and a significant pay gap remains: the USGA has a long history of writing bigger checks to winners of the men's U.S. Open than of the U.S. Women's Open.
// gameplay_ability_system.h

#pragma once

#include "gameplay_node.h"
#include "gameplay_tags.h"

#include <core/hash_map.h>
#include <core/vector.h>

#include <cmath>
#include <random>

class GameplayEffect;
class GameplayEffectCue;
class GameplayEffectModifier;
class GameplayAbility;
class GameplayAttribute;
class GameplayAttributeData;
class GameplayAttributeSet;
class GameplayAbilitySystem;

namespace UpdateAttributeOperation {
enum Type {
	/** Won't update current value. */
	None,
	/** Will multiply current value with relative change of base value. */
	Relative,
	/** Will add delta of base change to current value. */
	Absolute,
	/** Overrides current value to new base value. */
	Override
};
}

class GAMEPLAY_ABILITIES_API GameplayEvent : public GameplayResource {
	GDCLASS(GameplayEvent, GameplayResource);
	OBJ_CATEGORY("GameplayAbilities");

public:
	virtual ~GameplayEvent() = default;

	void set_event_tag(const String &value);
	const String &get_event_tag() const;

	void add_event_target(Node *target);
	const Array &get_event_targets() const;

private:
	/** Tag identifying this event. */
	String event_tag;
	/** Event targets are the ones being targeted by the event. */
	Array event_targets;

	static void _bind_methods();
};

class GAMEPLAY_ABILITIES_API GameplayEffectNode : public GameplayNode {
	GDCLASS(GameplayEffectNode, GameplayNode);
	OBJ_CATEGORY("GameplayAbilities");

public:
	virtual ~GameplayEffectNode() = default;

	void initialise(GameplayAbilitySystem *source, GameplayAbilitySystem *target, const Ref<GameplayEffect> &effect, int64_t level, double normalised_level);

	Node *get_source() const;
	Node *get_target() const;
	Ref<GameplayEffect> get_effect() const;
	double get_duration() const;
	int64_t get_stacks() const;
	int64_t get_level() const;
	double get_normalised_level() const;

	void add_stack(int64_t value);
	void remove_stack(int64_t value);

	void effect_process(double delta);
	void set_effect_process(bool value);

protected:
	void _notification(int notification);

private:
	GameplayAbilitySystem *source = nullptr;
	GameplayAbilitySystem *target = nullptr;
	Ref<GameplayEffect> effect;
	int64_t level = 1;
	int64_t previous_stack = 1;
	double normalised_level = 1;
	double duration = 0;
	double period = 0;
	bool stack_overflow = false;
	bool stack_applied = false;
	bool should_effect_process = true;
	int64_t internal_stacks = 1;
	Vector<GameplayAbility *> granted_abilities;

	double calculate_duration() const;
	double calculate_period_threshold() const;
	void apply_effect(const Ref<GameplayEffect> &effect);
	void apply_effects(const Array &effects);
	void execute_effect();
	GameplayAbilitySystem *get_stacking_system() const;
	void start_effect();
	void end_effect(bool cancelled);

	static void _bind_methods();
};

/**
 * The gameplay ability system is the core processing node of the system; it handles all abilities (inactive or active), active effects and state changes.
 * Additionally, the system emits the following signals:
 * gameplay_cue_activated(source, cue_tag, level, magnitude, persistent)
 * gameplay_cue_removed(source, cue_tag)
 *
 * gameplay_effect_activated(source, effect)
 * gameplay_effect_infliction_failed(source, effect)
 * gameplay_effect_removal_failed(source, effect)
 * gameplay_effect_ended(source, effect, cancelled)
 *
 * gameplay_ability_activated(source, ability)
 * gameplay_ability_cancelled(source, ability)
 * gameplay_ability_blocked(source, ability)
 * gameplay_ability_ready(source, ability)
 *
 * gameplay_attribute_changed(source, attribute, old_value)
 * gameplay_base_attribute_changed(source, attribute, old_base, old_value)
 */
class GAMEPLAY_ABILITIES_API GameplayAbilitySystem : public GameplayNode {
	GDCLASS(GameplayAbilitySystem, GameplayNode);
	OBJ_CATEGORY("GameplayAbilities");

	friend class GameplayEffectNode;
	friend class GameplayAbility;

public:
	GameplayAbilitySystem();
	virtual ~GameplayAbilitySystem() = default;

	/** Gets the attribute set owned by this system. */
	const Ref<GameplayAttributeSet> &get_attributes() const;
	/** Gets all currently active and owned tags. */
	Ref<GameplayTagContainer> get_active_tags() const;

	/** Gets abilities from current system. */
	GameplayAbility *get_ability_by_name(const StringName &name) const;
	GameplayAbility *get_ability_by_index(int64_t index) const;
	int64_t get_ability_count() const;
	/** Gets all active abilities. */
	Array get_active_abilities() const;
	/** Intended for internal usage. */
	const Vector<GameplayAbility *> &get_abilities_vector() const;
	const Vector<GameplayAbility *> &get_active_abilities_vector() const;

	/** Gets all persistent cues on this target. */
	const Ref<GameplayTagContainer> &get_persistent_cues() const;

	/** Queries active effects and returns those which match the given tag. */
	Array query_active_effects_by_tag(const String &tag) const;
	/** Queries active effects and returns those with at least one of the given tags. */
	Array query_active_effects(const Ref<GameplayTagContainer> &tags) const;
	/** Gets remaining duration left on active effect. */
	double get_remaining_effect_duration(const Ref<GameplayEffect> &effect) const;

	/** Returns true if this ability system triggered any abilities via the given event. */
	bool handle_event(const Ref<GameplayEvent> &event);

	/** Checks if an attribute is present in this ability system. */
	bool has_attribute(const StringName &name) const;
	/** Gets attribute data from this ability system or null if no such attribute exists. */
	Ref<GameplayAttribute> get_attribute(const StringName &name) const;
	Ref<GameplayAttributeData> get_attribute_data(const StringName &name) const;
	double get_base_attribute_value(const StringName &name) const;
	double get_current_attribute_value(const StringName &name) const;
	/** Updates base value of attribute. */
	bool update_base_attribute(const StringName &name, double value, UpdateAttributeOperation::Type operation = UpdateAttributeOperation::None);

	/** Adds tags. */
	void add_tag(const String &tag);
	void add_tags(const Ref<GameplayTagContainer> &tags);
	/** Removes tags. */
	void remove_tag(const String &tag);
	void remove_tags(const Ref<GameplayTagContainer> &tags);

	/** Adds a single ability to this instance. */
	void add_ability(Node *ability);
	void add_abilities(const Array &abilities);
	/** Removes a single ability from this instance. */
	void remove_ability(Node *ability);
	void remove_abilities(const Array &abilities);

	/** Tries to activate ability. */
	void activate_ability(Node *ability);
	/** Tries to cancel ability.
	 */
	void cancel_ability(Node *ability);

	/** Checks if a single effect is applicable. */
	bool can_apply_effect(Node *source, const Ref<GameplayEffect> &effect, int64_t stacks = 1, int64_t level = 1, double normalised_level = 1) const;
	/** Filters effects and returns array of those applicable. */
	Array filter_effects(Node *source, const Array &effects) const;
	/** Tries to apply given effect and returns success. */
	bool try_apply_effect(Node *source, const Ref<GameplayEffect> &effect, int64_t stacks = 1, int64_t level = 1, double normalised_level = 1);
	/** Adds a single effect from source to this instance. */
	void apply_effect(Node *source, const Ref<GameplayEffect> &effect, int64_t stacks = 1, int64_t level = 1, double normalised_level = 1);
	void apply_effects(Node *source, const Array &effects, int64_t stacks = 1, int64_t level = 1, double normalised_level = 1);
	/** Removes a single effect applied from source to this instance. */
	void remove_effect(Node *source, const Ref<GameplayEffect> &effect, int64_t stacks = 1, int64_t level = 1);
	void remove_effect_node(Node *source, Node *effect_node, int64_t stacks = 1, int64_t level = 1);

	/** Adds a single cue to this target. */
	void apply_cue(const String &cue, double level = 1, double magnitude = 0, bool persistent = false);
	/** Removes a cue from the system. */
	void remove_cue(const String &cue);

	/** Gets current stack count of given effect. */
	int64_t get_stack_count(const Ref<GameplayEffect> &effect) const;
	/** Gets current stack level for given effect. */
	int64_t get_stack_level(const Ref<GameplayEffect> &effect) const;

	void set_attribute_set(const Ref<GameplayAttributeSet> &value);
	const Ref<GameplayAttributeSet> &get_attribute_set() const;

	/** Targeting */
	void add_target(Node *target);
	void remove_target(Node *target);
	void set_targets(const Array &value);
	const Array &get_targets() const;

protected:
	void _notification(int notification);

private:
	struct ActiveEffectEntry {
		GameplayEffectNode *effect_node = nullptr;
		int64_t level = 1;
		int64_t stacks = 1;
	};

	HashMap<StringName, ActiveEffectEntry> effect_stacking;
	Array targets;
	Ref<GameplayAttributeSet> attributes;
	Ref<GameplayTagContainer> persistent_cues = make_reference<GameplayTagContainer>();
	Ref<GameplayTagContainer> active_tags = make_reference<GameplayTagContainer>();
	Vector<GameplayAbility *> abilities;
	Vector<GameplayAbility *> active_abilities;
	Vector<GameplayEffectNode *> active_effects;

	static std::random_device rdevice;
	static std::default_random_engine rengine;
	static std::uniform_real_distribution<double> rgenerator;

	void execute_effect(GameplayEffectNode *node);
	void apply_modifiers(GameplayEffectNode *node, const Array &modifiers);
	void add_active_ability(GameplayAbility *ability);
	void remove_active_ability(GameplayAbility *ability);
	void add_effect(GameplayAbilitySystem *source, const Ref<GameplayEffect> &effect, int64_t stacks, int64_t level, double normalised_level);

	static double execute_magnitude(double magnitude, double current_value, int operation);

	static void _bind_methods();
};
This invention relates to a vacuum switch including windmill-shaped electrodes therein.

FIG. 12 is a sectional view showing the overall structure of a vacuum switch which has a pair of contacts hermetically sealed within a highly evacuated vacuum vessel. An insulating cylinder 21 has attached to its opposite ends end plates 22a and 22b to constitute a vacuum vessel 23, the inner portion of which is highly evacuated. Opposingly disposed within the vacuum vessel 23 are a stationary electrode 1a secured to a tip of a stationary electrode rod 24a extending through one of the end plates 22a, and a movable electrode 1b secured to a tip of a movable electrode rod 24b extending through the other end plate 22b. A bellows 25 is disposed between the movable electrode rod 24b and the end plate 22b. The bellows 25 allows the movable electrode rod 24b, which is connected to an operating device (not shown), to be driven in the axial direction. This movement of the movable electrode rod 24b brings the electrode 1a on the stationary side and the electrode 1b on the movable side into and out of electrical contact. In order to prevent metal vapor diffused from the arc generated across the electrodes 1a and 1b from depositing on the inner wall surface of the vacuum vessel 23, a shield 26 is mounted to the inner wall surface of the insulating cylinder 21 by a shield support 27.

The electrodes 1a and 1b of such a vacuum switch have the same configuration: a windmill type with grooves in the electrode. By the provision of these grooves, the electrical path in the electrode is limited so as to define a reciprocating, loop-shaped electrical path extending in the circumferential direction, whereby the arc is driven by a magnetic field to move along the circumference of the electrode. The arc is thus prevented from staying at one position, avoiding local melting of the electrode and improving the interrupting performance. Also, in order to obtain a strong magnetic drive force immediately after arc generation, the structure has the arc-running surface coincide with the contact surface.

FIGS. 13 to 16 illustrate the structure of a windmill-shaped electrode of a conventional vacuum switch tube disclosed, for example, in Japanese Patent Laid-Open No. 4-368734, FIGS. 13 and 15 being plan views and FIGS. 14 and 16 being side views. In the figures, the electrode rod 24 (the stationary electrode rod 24a or the movable electrode rod 24b) has thereon a windmill-shaped electrode 1 (the stationary side electrode 1a or the movable side electrode 1b). The windmill-shaped electrode 1 integrally comprises an auxiliary electrode 31 and a ring-shaped electrode 32. The auxiliary electrode 31 comprises a central portion 33 mounted to an end portion of the electrode rod 24, a plurality of arms 34 disposed on the central portion 33 in a windmill or Buddhist-cross shape and extending in an arc from the central portion 33 toward the outer circumferential portion, and a connecting portion 35 disposed at each of the tips of the plurality of arms 34. The ring-shaped electrode 32 has an annular shape with its width substantially equal to the width of the arms 34 of the auxiliary electrode 31, and the ring-shaped electrode 32 is connected to the connecting portions 35.
In such an arrangement, when the windmill-shaped electrodes 1 (the stationary side electrode 1a and the movable side electrode 1b) are separated, an electric arc is generated at the contacting surface of the ring-shaped electrode 32. When the arc is generated at the point A of FIGS. 15 and 16, for example, an electric current I1 flowing through the arms 34 of the auxiliary electrode 31 generates a magnetic drive force F in the circumferential direction of the ring-shaped electrode 32, whereby the arc is driven to rotate around the outer circumference of the ring-shaped electrode 32. Also, when the arc is generated at a position other than the connecting portions 35, such as the point E of FIGS. 15 and 16, a magnetic drive force in the circumferential direction of the ring-shaped electrode 32 is likewise generated by an electric current I2 flowing into the ring-shaped electrode 32 from the arms 34 of the auxiliary electrode 31. Therefore, the arc is rotated along the ring-shaped electrode 32. As has been described, in the conventional windmill-shaped electrode 1, the arc is generated at the ring-shaped electrode 32 and is magnetically driven immediately after its generation. As a result, the local temperature rise at the windmill-shaped electrode 1, caused by the arc in the interval between generation and the onset of magnetic driving, is suppressed, thus improving the interrupting performance.

In the windmill-shaped electrode 1 of the above-described conventional vacuum switch tube, however, when an electric arc is generated at a point E1 of FIG. 17, for example, between the neighboring connecting portions 35 and 35, then in addition to the current I2 flowing into the arc through the arm 34a, an electric current I3 flows from the arm 34b. This current I3 generates a force F3 in the direction preventing rotation of the arc, so that the time from arc generation until magnetic driving of the arc cannot be made short, which prevents improvement of the interrupting performance.

Accordingly, an object of the present invention is to provide a vacuum switch free from the above-discussed problems of the conventional vacuum switch. Another object of the present invention is to provide a vacuum switch in which an electric arc can be strongly magnetically driven immediately after arc generation, irrespective of the position on the contacting surface between the stationary side electrode and the movable side electrode at which the arc is generated, thereby improving the interrupting performance.
With the above objects in view, the present invention resides in a vacuum switch comprising: a pair of windmill-shaped electrodes disposed within a vacuum tube, each having formed therein a plurality of spiral grooves extending from a central portion to a circumferential portion thereof, and including a windmill-shaped portion separated by said grooves and a plurality of contact portions separated by said grooves and having a thickness larger than that of said windmill portion; said windmill-shaped electrodes being arranged such that said contact portions are brought into contact with each other when said pair of windmill-shaped electrodes are closed, that an electric arc is generated on said contact portions when said pair of windmill-shaped electrodes are separated from each other, that a magnetic flux is generated by an electric current flowing into the electric arc from said windmill portion, and that a component of said magnetic flux parallel to the contact surface, serving as an arc driving force within a range of 0.5 mm from the contacting surface of the leg portion of said arc, has a magnetic flux density equal to or larger than 0.01 tesla with respect to an electric current of 1 kA.

A ratio of an inner diameter Di of said contact portion to an outer diameter D of said windmill-shaped electrodes may be equal to or greater than 0.4. The difference in thickness between the windmill portions and the contact portions may be equal to or less than 5 mm. Each of said windmill-shaped electrodes may be connected to one of a pair of electrode rods, and a ratio of a diameter d of the connection portion of said electrode rod to the inner diameter Di of said contact portion may be equal to or less than 0.6. The windmill-shaped electrodes may be made of a Cu-Cr material including 20-60 weight % of Cr.
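The numeric design constraints recited above can be expressed as a simple feasibility check. The sketch below is illustrative only: the function and parameter names are invented, and the flux-density criterion is reduced to a direct comparison against the stated 0.01 tesla per 1 kA threshold.

def meets_claimed_constraints(outer_d, contact_inner_d, rod_d,
                              windmill_thickness, contact_thickness,
                              flux_density_at_1ka):
    """Check a candidate electrode design against the claimed ranges.
    Lengths in mm; flux density in tesla for a 1 kA current."""
    return (contact_inner_d / outer_d >= 0.4                      # Di/D >= 0.4
            and contact_thickness - windmill_thickness <= 5.0     # thickness difference <= 5 mm
            and rod_d / contact_inner_d <= 0.6                    # d/Di <= 0.6
            and flux_density_at_1ka >= 0.01)                      # >= 0.01 T per 1 kA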
A t(7;12) balanced translocation with breakpoints overlapping those of the Williams-Beuren and 12q14 microdeletion syndromes

The molecular characterization of balanced chromosomal rearrangements has always been of advantage in identifying disease-causing genes. Here, we describe the breakpoint mapping of a de novo balanced translocation t(7;12)(q11.22;q14.2) in a patient presenting with failure to thrive associated with moderate mental retardation, facial anomalies, and chronic constipation. The localization of the breakpoints and the co-occurrence of Williams-Beuren syndrome and 12q14 microdeletion syndrome phenotypes suggested that the expression of some of the dosage-sensitive genes of these two segmental aneuploidies was modified in cells of the proposita. However, we were unable to identify chromosome 7- and/or 12-mapping genes that showed disturbed expression in the lymphoblastoid cells of the proposita. This case shows that position effect might operate in some tissues, but not in others. It also illustrates the overlap of phenotypes presented by patients with the recently described 12q14 structural rearrangements. © 2010 Wiley-Liss, Inc.
Are topical antibiotics an alternative to oral antibiotics for children with acute otitis media and ear discharge?

#### What you need to know

Acute otitis media (AOM) is a common reason for childhood primary care visits and antibiotic prescription in the United Kingdom.1 2 Many randomised controlled trials (RCTs) have shown that symptoms settle within a few days, irrespective of antibiotic use,3 with one systematic review reporting that ear pain takes eight days to resolve fully in 90% of children.4 However, observational data and an individual patient data meta-analysis have shown that, among children with AOM, those with ear discharge have a worse prognosis5 and a more prolonged duration of ear pain or fever than those without ear discharge.6 Current guidance from the UK National Institute for Health and Care Excellence recommends that general practitioners consider immediately prescribing oral antibiotics for children presenting with AOM and ear discharge.7 However, oral antibiotics commonly have side effects such as diarrhoea, vomiting, and rashes3 and increase the risk of antimicrobial resistance.8 For children with AOM and ear discharge, topical antibiotics are a possible alternative because they put less selective resistance pressure on bacteria, and eardrum perforation allows direct entry of the antibiotic into the middle ear without exposing children to systemic side effects.9 However, the risk of ototoxicity is debated.10 11 To our knowledge, no RCTs or relevant systematic reviews of the effectiveness of topical antibiotics for children with AOM and ear discharge have been published.

### Oral antibiotics

By contrast, individual patient data meta-analysis evidence shows that oral antibiotics are more effective than
// Test the ability to reassign proximity properties to a geometry that already
// has the proximity role.
TEST_F(GeometryStateTest, ModifyProximityProperties) {
  SetUpSingleSourceTree();
  EXPECT_EQ(gs_tester_.proximity_engine().num_geometries(), 0);

  ProximityProperties empty_props;
  ProximityProperties props1;
  props1.AddProperty("prop1", "value", 1);
  ProximityProperties props2;
  props2.AddProperty("prop2", "value", 2);
  AddRigidHydroelasticProperties(1.0, &props2);

  DRAKE_EXPECT_THROWS_MESSAGE(
      geometry_state_.AssignRole(source_id_, geometries_[0], empty_props,
                                 RoleAssign::kReplace),
      std::logic_error,
      "Trying to replace the properties on geometry id \\d+ for the 'proximity'"
      " role.*");

  const auto& hydroelastic_geometries =
      internal::ProximityEngineTester::hydroelastic_geometries(
          gs_tester_.proximity_engine());

  geometry_state_.AssignRole(source_id_, geometries_[0], props1);
  EXPECT_EQ(gs_tester_.proximity_engine().num_geometries(), 1);
  const ProximityProperties* props =
      geometry_state_.GetProximityProperties(geometries_[0]);
  EXPECT_NE(props, nullptr);
  EXPECT_TRUE(props->HasGroup("prop1"));
  EXPECT_TRUE(props->HasProperty("prop1", "value"));
  EXPECT_EQ(props->GetProperty<int>("prop1", "value"),
            props1.GetProperty<int>("prop1", "value"));
  EXPECT_EQ(hydroelastic_geometries.hydroelastic_type(geometries_[0]),
            internal::HydroelasticType::kUndefined);

  DRAKE_EXPECT_NO_THROW(geometry_state_.AssignRole(
      source_id_, geometries_[0], props2, RoleAssign::kReplace));
  EXPECT_EQ(gs_tester_.proximity_engine().num_geometries(), 1);
  props = geometry_state_.GetProximityProperties(geometries_[0]);
  EXPECT_NE(props, nullptr);
  EXPECT_TRUE(props->HasGroup("prop2"));
  EXPECT_FALSE(props->HasGroup("prop1"));
  EXPECT_TRUE(props->HasProperty("prop2", "value"));
  EXPECT_EQ(props->GetProperty<int>("prop2", "value"),
            props2.GetProperty<int>("prop2", "value"));
  EXPECT_EQ(hydroelastic_geometries.hydroelastic_type(geometries_[0]),
            internal::HydroelasticType::kRigid);
}
package com.feihua.framework.base.modules.area.po;

import feihua.jdbc.api.pojo.BasePo;

/**
 * This class was generated by MyBatis Generator.
 * @author yangwei 2019-04-24 17:02:21
 * Database Table Remarks:
 *   Area table
 *
 * This class corresponds to the database table base_area
 * @mbg.generated do_not_delete_during_merge 2019-04-24 17:02:21
 */
public class BaseAreaPo extends feihua.jdbc.api.pojo.BaseTreePo<String> {
    /**
     * Database Column Remarks:
     *   Name
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.NAME
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String name;

    /**
     * Database Column Remarks:
     *   Area type, configured via dictionary
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.TYPE
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String type;

    /**
     * Database Column Remarks:
     *   Sort order
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.SEQUENCE
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private Integer sequence;

    /**
     * Database Column Remarks:
     *   Hierarchy level
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.LEVEL
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private Integer level;

    /**
     * Database Column Remarks:
     *   Baidu Map longitude
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.LONGITUDE
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String longitude;

    /**
     * Database Column Remarks:
     *   Baidu Map latitude
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.LATITUDE
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String latitude;

    /**
     * Database Column Remarks:
     *   Parent id
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 1
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID1
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId1;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 2
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID2
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId2;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 3
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID3
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId3;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 4
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID4
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId4;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 5
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID5
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId5;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 6
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID6
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId6;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 7
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID7
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId7;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 8
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID8
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId8;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 9
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID9
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId9;

    /**
     * Database Column Remarks:
     *   Parent id at LEVEL 10
     *
     * This field was generated by MyBatis Generator.
     * This field corresponds to the database column base_area.PARENT_ID10
     *
     * @mbg.generated 2019-04-24 17:02:21
     */
    private String parentId10;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getType() {
        return type;
    }

    public void setType(String type) {
        this.type = type;
    }

    public Integer getSequence() {
        return sequence;
    }

    public void setSequence(Integer sequence) {
        this.sequence = sequence;
    }

    public Integer getLevel() {
        return level;
    }

    public void setLevel(Integer level) {
        this.level = level;
    }

    public String getLongitude() {
        return longitude;
    }

    public void setLongitude(String longitude) {
        this.longitude = longitude;
    }

    public String getLatitude() {
        return latitude;
    }

    public void setLatitude(String latitude) {
        this.latitude = latitude;
    }

    public String getParentId() {
        return parentId;
    }

    public void setParentId(String parentId) {
        this.parentId = parentId;
    }

    public String getParentId1() {
        return parentId1;
    }

    public void setParentId1(String parentId1) {
        this.parentId1 = parentId1;
    }

    public String getParentId2() {
        return parentId2;
    }

    public void setParentId2(String parentId2) {
        this.parentId2 = parentId2;
    }

    public String getParentId3() {
        return parentId3;
    }

    public void setParentId3(String parentId3) {
        this.parentId3 = parentId3;
    }

    public String getParentId4() {
        return parentId4;
    }

    public void setParentId4(String parentId4) {
        this.parentId4 = parentId4;
    }

    public String getParentId5() {
        return parentId5;
    }

    public void setParentId5(String parentId5) {
        this.parentId5 = parentId5;
    }

    public String getParentId6() {
        return parentId6;
    }

    public void setParentId6(String parentId6) {
        this.parentId6 = parentId6;
    }

    public String getParentId7() {
        return parentId7;
    }

    public void setParentId7(String parentId7) {
        this.parentId7 = parentId7;
    }

    public String getParentId8() {
        return parentId8;
    }

    public void setParentId8(String parentId8) {
        this.parentId8 = parentId8;
    }

    public String getParentId9() {
        return parentId9;
    }

    public void setParentId9(String parentId9) {
        this.parentId9 = parentId9;
    }

    public String getParentId10() {
        return parentId10;
    }

    public void setParentId10(String parentId10) {
        this.parentId10 = parentId10;
    }

    public com.feihua.framework.base.modules.area.api.ApiBaseAreaPoService service() {
        return com.feihua.utils.spring.SpringContextHolder.getBean(com.feihua.framework.base.modules.area.api.ApiBaseAreaPoService.class);
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder();
        sb.append(getClass().getSimpleName());
        sb.append(" [");
        sb.append("Hash = ").append(hashCode());
        sb.append(", name=").append(name);
        sb.append(", type=").append(type);
        sb.append(", sequence=").append(sequence);
        sb.append(", level=").append(level);
        sb.append(", longitude=").append(longitude);
        sb.append(", latitude=").append(latitude);
        sb.append(", parentId=").append(parentId);
        sb.append(", parentId1=").append(parentId1);
        sb.append(", parentId2=").append(parentId2);
        sb.append(", parentId3=").append(parentId3);
        sb.append(", parentId4=").append(parentId4);
        sb.append(", parentId5=").append(parentId5);
        sb.append(", parentId6=").append(parentId6);
        sb.append(", parentId7=").append(parentId7);
        sb.append(", parentId8=").append(parentId8);
        sb.append(", parentId9=").append(parentId9);
        sb.append(", parentId10=").append(parentId10);
        sb.append("]");
        return sb.toString();
    }
}
# photogram/views.py
from django.shortcuts import render, redirect
from django.contrib.auth.decorators import login_required
from django.contrib.auth.models import User

from .models import Profile, Photo, comments, Following
from .forms import NewPhotoForm, EditProfileForm

# Create your views here.
@login_required(login_url='/accounts/login/')
def index(request):
    photos = Photo.objects.all()
    all_comments = comments.objects.all()
    return render(request, 'index.html', {"photos": photos, "all_comments": all_comments})

@login_required(login_url='/accounts/login/')
def profile(request):
    current_user = request.user
    # filter().first() returns None rather than raising, so check for None
    # explicitly instead of wrapping the query in try/except.
    profile = Profile.objects.filter(user=current_user).first()
    if profile is None:
        return redirect('edit-profile')
    photos = Photo.objects.filter(user=current_user).all()
    posts = photos.count()
    followings = Following.objects.filter(profile=profile).all()
    following = followings.count()
    all_followers = Following.objects.filter(user=current_user).all()
    followers = all_followers.count()
    return render(request, 'profile.html', {"profile": profile, "photos": photos, "current_user": current_user, "posts": posts, "following": following, "followers": followers})

@login_required(login_url='/accounts/login/')
def edit_profile(request):
    current_user = request.user
    if request.method == 'POST':
        form = EditProfileForm(request.POST, request.FILES)
        if form.is_valid():
            profile = form.save(commit=False)
            profile.user = current_user
            profile.save()
            return redirect('profile')
    else:
        form = EditProfileForm()
    return render(request, 'edit_profile.html', {"form": form})

@login_required(login_url='/accounts/login/')
def new_photo(request):
    current_user = request.user
    if request.method == 'POST':
        form = NewPhotoForm(request.POST, request.FILES)
        if form.is_valid():
            photo = form.save(commit=False)
            photo.user = current_user
            photo.save()
            return redirect('profile')
    else:
        form = NewPhotoForm()
    return render(request, 'new_photo.html', {"form": form})

@login_required(login_url='/accounts/login/')
def comment(request, photo_id):
    if 'comment' in request.GET and request.GET["comment"]:
        comment = request.GET.get("comment")
        new_comment = comments(comment=comment)
        new_comment.user = request.user
        comment_photo = Photo.objects.filter(id=photo_id).first()
        new_comment.photo = comment_photo
        new_comment.save()
    return redirect('index')

@login_required(login_url='/accounts/login/')
def like_photo(request, photo_id):
    photo = Photo.objects.filter(id=photo_id).first()
    photo.likes += 1
    photo.save()
    return redirect('index')

@login_required(login_url='/accounts/login/')
def follow_user(request, user_id):
    current_user = request.user
    profile = Profile.objects.filter(user=current_user).first()
    user = User.objects.filter(id=user_id).first()
    follow = Following(profile=profile, user=user)
    follow.save()
    return redirect('people')

@login_required(login_url='/accounts/login/')
def people(request):
    profiles = Profile.objects.all()
    return render(request, 'people.html', {"profiles": profiles})
Effectiveness of Building-Wide Integrated Pest Management Programs for German Cockroach and Bed Bug in a High-Rise Apartment Building

Bed bug, Cimex lectularius (L.) (Hemiptera: Cimicidae), and German cockroach, Blattella germanica (L.) (Blattodea: Ectobiidae), infestations are commonly found in low-income housing communities and result in negative health effects and economic burden. Integrated Pest Management (IPM) has been shown to be an effective approach for managing these pests, yet the practice of IPM in housing communities is very limited. We evaluated the effectiveness of a contractor-led bed bug IPM program and a researcher-led cockroach IPM program in a high-rise apartment building for 1 yr. A second apartment building that received conventional monthly pest control service was used as a control. The bed bug infestation rate decreased from 9% at 0 mo to 3% at 12 mo (63% reduction), even though the contractor only partially followed the IPM protocol; the German cockroach infestation rate decreased from 49% at 0 mo to 12% at 12 mo (75% reduction). In the control building, no monitors were installed in the infested apartments and the apartments received cursory treatment services from an existing pest control contractor. The bed bug infestation rate increased from 6% at 0 mo to 12% at 12 mo (117% increase); the German cockroach infestation rate decreased from 47% at 0 mo to 29% at 12 mo (39% reduction). IPM is a much more effective approach for building-wide control of cockroaches and bed bugs than conventional pest control service. This study confirms the benefit of building-wide IPM on pest reduction and highlights the challenges that exist for carrying out IPM programs in low-income communities.
Hotspot or Heatwave? Getting to Grips with Neutron Star Burst Oscillations

Many accreting neutron stars, including two of the millisecond pulsars, exhibit high frequency oscillations during Type I X-ray bursts. The properties of the burst oscillations reflect the nature of the thermal asymmetry on the stellar surface. The mechanism that gives rise to the asymmetry, however, remains unclear: possibilities include a hotspot due to uneven fuel distribution, modes of oscillation in the surface layers of the neutron star, or vortices driven by the Coriolis force. I will review some of the latest theory and observations, and present the results of a recent study of variability in the burst oscillations of the millisecond pulsar XTE J1814-338.
Refilling medications through an online patient portal: consistent improvements in adherence across racial/ethnic groups

OBJECTIVE Online patient portals are being widely implemented; however, no studies have examined whether portals influence health behaviors or outcomes similarly across patient racial/ethnic subgroups. We evaluated longitudinal changes in statin adherence to determine whether racial/ethnic minorities initiating use of the online refill function in patient portals had similar changes over time compared with Whites.

METHODS We examined a retrospective cohort of diabetes patients who were existing patient portal users. The primary exposure was initiating online refill use (either exclusively for all statin refills or occasionally for some refills), compared with using the portal for other tasks (eg, exchanging secure messages with providers). The primary outcome was change in statin adherence, measured as the percentage of time a patient was without a supply of statins. Adjusted generalized estimating equation models controlled for race/ethnicity as a primary interaction term.

RESULTS Fifty-eight percent of patient portal users were white, and all racial/ethnic minority groups had poorer baseline statin adherence compared with Whites. In adjusted difference-in-difference models, statin adherence improved significantly over time among patients who exclusively refilled prescriptions online, even after comparing changes over time with other portal users (4% absolute decrease in percentage of time without medication). This improvement was statistically similar across all racial/ethnic groups.

DISCUSSION Patient portals may encourage or improve key health behaviors, such as medication adherence, for engaged patients, but further research will likely be required to reduce underlying racial/ethnic differences in adherence.

CONCLUSION In a well-controlled examination of diabetes patients' behavior when using a new online feature for their healthcare management, patient portals were linked to better medication adherence across all racial/ethnic groups.
package net.es.nsi.pce.pf.api.cons;

/**
 * An attribute constraint holding an arbitrary object value.
 *
 * @author hacksaw
 */
public class ObjectAttrConstraint extends AttrConstraint {
    private Object value;

    public Object getValue() {
        return value;
    }

    public void setValue(Object value) {
        this.value = value;
    }

    /**
     * Returns the stored value cast to the requested type.
     */
    public <T extends Object> T getValue(Class<T> classOfT) {
        return classOfT.cast(value);
    }
}
"""Chess game, for learning to grab images from a sprite sheet.""" import sys import pygame from settings import Settings class ChessGame: """Overall class to manage game assets and behavior.""" def __init__(self): """Initialize the game, and create resources.""" pygame.init() self.settings = Settings() self.screen = pygame.display.set_mode( (self.settings.screen_width, self.settings.screen_height)) pygame.display.set_caption("Chess") def run_game(self): """Start the main loop for the game.""" while True: self._check_events() self._update_screen() def _check_events(self): for event in pygame.event.get(): if event.type == pygame.QUIT: sys.exit() elif event.type == pygame.KEYDOWN: if event.key == pygame.K_q: sys.exit() def _update_screen(self): self.screen.fill(self.settings.bg_color) pygame.display.flip() if __name__ == '__main__': chess_game = ChessGame() chess_game.run_game()
Various belt type continuously variable transmissions have been proposed as transmissions for use in vehicles like automobiles. Generally, a conventional belt type continuously variable transmission comprises two pulleys, each having a circumferentially extending V-shaped groove, and a belt having a V-shaped cross section. One pulley is provided on a rotational shaft and the other on another shaft, and the belt is disposed around these pulleys for power transmission. While power is being transmitted from one of the rotational shafts to the other, the widths of the circumferential grooves of these pulleys are varied continuously in inverse proportion to each other to adjust the speed change ratio of the transmission. For example, a prior-art belt used in such a belt type continuously variable transmission comprises laminated endless flat rings and a plurality of elements; the elements are in contact with one another in succession along the rings, and the rings support the elements slidably. FIG. 3 shows an example of an element used for constituting such a belt for a continuously variable transmission. The prior-art element 200 shown in FIG. 3, which is punched out from a metal plate (not shown), comprises a body portion 201, a head portion 202 and a neck portion 203 in a one-piece body. The body portion 201 is designed to come into contact with a pulley of a continuously variable transmission (not shown), and the head portion 202 is located above the body portion 201 with the neck portion 203 connecting the body portion 201 and the head portion 202. The neck portion 203 has a width narrower than those of the body and head portions 201 and 202. Furthermore, a pair of saddle portions 204 are provided symmetrically at the right and left upper parts of the body portion 201, and a pair of ear portions 205 are provided at the right and left sides of the head portion 202 so as to face the saddle portions 204, respectively. By this design of the element 200, a ring-accommodating space 210, which is to accommodate a ring (not shown), is defined on each side of the element by the upper end of the saddle portion 204, the side end of the neck portion 203 and the lower end of the ear portion 205. When rings are accommodated in the ring-accommodating spaces 210, the rings are placed on the saddle portions 204, respectively. Furthermore, a cylindrical nose portion 206 is provided protruding from one face of the head portion 202 while a cylindrical hole 207 is provided on the other face of the head portion 202, so that the nose portion 206 of one element 200 can be fitted into the hole 207 of another element 200. Moreover, a first recess 208 is provided at each upper innermost part of the saddle portions 204 in an arc figure connecting smoothly to a corresponding lower side end of the neck portion 203, while a second recess 209 is provided at each lower innermost part of the ear portions 205 in an arc figure connecting smoothly to a corresponding upper side end of the neck portion 203. In such an element exemplified by the above description, for avoiding stress concentration at the part where the upper innermost part of the saddle portion meets the lower side end of the neck portion, for example, Japanese Utility-Model Publication No. H05(1993)-14028 discloses a method in which the bottom of the recess (for example, the first recess 208 shown in FIG. 3) at the part where the upper innermost part of the saddle portion meets the lower side end of the neck portion is formed in a convex figure whose peak is approximately at the center in the thickness direction of the plate forming the element.
Incidentally, a relatively large stress is generated in the part of the element where the lower innermost part of the ear portion meets the upper side end of the neck portion when each element comes into contact with, or leaves, the pulley of a continuously variable transmission in operation. Therefore, for example, if the thickness of the element is increased to secure sufficient strength for this part, then the efficiency of power transmission may decrease, or noise may arise. On the other hand, for example, if the above method (disclosed in Japanese Utility-Model Publication No. H05(1993)-14028) is applied also to the part where the lower innermost part of the ear portion meets the upper side end of the neck portion, then a grinding process must be added in the manufacturing work, and this process can increase the manufacturing cost of the element.
import { Entrypoint } from './Entrypoint';

describe('Entrypoint', () => {
  const flat1 = {
    'type': 'entrypoint',
    'value': ['rm -rf /', 'foo'],
    'container': 'b7e62be9-e87b-4d54-90c3-1a477b04014b',
    'id': 'b6124aa0-4a61-4dbf-a5d1-4a3031f46f79'
  };

  it('should handle round trip', () => {
    const a1 = Entrypoint.construct(Entrypoint.OBJECT_NAME) as Entrypoint;
    a1.fromFlat(flat1);
    expect(a1.value.length).toEqual(2);
    expect(a1.toFlat()).toEqual(flat1);
  });

  it('should be empty if new', () => {
    const a1 = Entrypoint.construct(Entrypoint.OBJECT_NAME) as Entrypoint;
    expect(a1.isEmpty()).toBeTruthy();
  });
});
Blurred boundaries: capturing and managing personal information in archival records in the digital era Over the last decade the role and responsibilities of archivists in managing personal information have shifted dramatically as record creation and capture have moved from paper to digital paradigms. Online collaborative tools have blurred the boundaries between personal and public spaces. In addition, ownership is underpinned by a complex network of legislation which comes into play depending not only on where the record author sits but also on the infrastructure of the software channels through which he or she generates and exchanges information. For example, a record author sitting in Europe may generate records through a software company with headquarters in Iceland, hosted within a Cloud in India but with an intended audience in the USA. How then is this set of records passed to the archivist, and who owns the records after transfer? This paper will discuss the challenges faced by archivists in acquiring, holding and negotiating access to personal information through time. The discussion is positioned from a UK/European standpoint, which provides a particular lens for the work, as Europe has possibly the toughest personal data and privacy legislation in the world. The paper will seek to position this perspective within the context of wider international considerations.
/* * Copyright (c) 2014, 2018, Oracle and/or its affiliates. All rights reserved. * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER. * * This code is free software; you can redistribute it and/or modify it * under the terms of the GNU General Public License version 2 only, as * published by the Free Software Foundation. * * This code is distributed in the hope that it will be useful, but WITHOUT * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or * FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License * version 2 for more details (a copy is included in the LICENSE file that * accompanied this code). * * You should have received a copy of the GNU General Public License version * 2 along with this work; if not, write to the Free Software Foundation, * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA. * * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA * or visit www.oracle.com if you need additional information or have any * questions. */ package org.graalvm.compiler.lir.dfa; import static jdk.vm.ci.code.ValueUtil.asRegister; import static jdk.vm.ci.code.ValueUtil.isRegister; import static jdk.vm.ci.code.ValueUtil.isStackSlot; import org.graalvm.compiler.core.common.LIRKind; import org.graalvm.compiler.lir.LIR; import org.graalvm.compiler.lir.LIRFrameState; import org.graalvm.compiler.lir.LIRInstruction; import org.graalvm.compiler.lir.framemap.FrameMap; import org.graalvm.compiler.lir.framemap.ReferenceMapBuilder; import org.graalvm.compiler.lir.gen.LIRGenerationResult; import org.graalvm.compiler.lir.phases.AllocationPhase; import jdk.vm.ci.code.ReferenceMap; import jdk.vm.ci.code.Register; import jdk.vm.ci.code.RegisterAttributes; import jdk.vm.ci.code.TargetDescription; import jdk.vm.ci.meta.Value; /** * Mark all live references for a frame state. The frame state uses this information to build the * {@link ReferenceMap}s. */ public final class LocationMarkerPhase extends AllocationPhase { @Override protected void run(TargetDescription target, LIRGenerationResult lirGenRes, AllocationContext context) { new Marker(lirGenRes.getLIR(), lirGenRes.getFrameMap()).build(); } static final class Marker extends LocationMarker<RegStackValueSet> { private final RegisterAttributes[] registerAttributes; private Marker(LIR lir, FrameMap frameMap) { super(lir, frameMap); this.registerAttributes = frameMap.getRegisterConfig().getAttributesMap(); } @Override protected RegStackValueSet newLiveValueSet() { return new RegStackValueSet(frameMap); } @Override protected boolean shouldProcessValue(Value operand) { if (isRegister(operand)) { Register reg = asRegister(operand); if (!reg.mayContainReference() || !attributes(reg).isAllocatable()) { // register that's not allocatable or not part of the reference map return false; } } else if (!isStackSlot(operand)) { // neither register nor stack slot return false; } return !operand.getValueKind().equals(LIRKind.Illegal); } /** * This method does the actual marking. 
*/ @Override protected void processState(LIRInstruction op, LIRFrameState info, RegStackValueSet values) { if (!info.hasDebugInfo()) { info.initDebugInfo(frameMap, !op.destroysCallerSavedRegisters() || !frameMap.getRegisterConfig().areAllAllocatableRegistersCallerSaved()); } ReferenceMapBuilder refMap = frameMap.newReferenceMapBuilder(); values.addLiveValues(refMap); info.debugInfo().setReferenceMap(refMap.finish(info)); } /** * Gets an object describing the attributes of a given register according to this register * configuration. */ private RegisterAttributes attributes(Register reg) { return registerAttributes[reg.number]; } } }
Circularly polarized antenna based on mu-negative transmission line A circularly polarized (CP) antenna based on mu-negative (MNG) transmission lines (TLs) is presented. The antenna consists of two pairs of resonators for two orthogonal polarizations. The 90° phase difference between them is achieved by detuning the corresponding resonance frequencies. The simulation results show that a 10 dB bandwidth (BW) of 29 MHz and a 3 dB axial ratio (AR) BW of 7 MHz are obtained at 1.5 GHz. Also, the right-hand (RH) CP gain is 2.22 dBic in the broadside direction, and the minimum 3 dB AR beamwidth of 116° is simulated at φ = 60°. The results indicate that the antenna is suitable for applications demanding a wide coverage area, such as the Global Positioning System (GPS) and wireless local area networks (WLAN).
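To make the design idea concrete, here is a short sketch (standard polarization-ellipse math, not code from the paper) showing why two orthogonal field components with equal amplitude and a 90° phase difference give perfect circular polarization, and how the axial ratio degrades when the phase or amplitude balance is detuned.

import numpy as np

def axial_ratio_db(ex, ey, delta_rad):
    """Axial ratio in dB for E = ex*cos(wt)*x_hat + ey*cos(wt + delta)*y_hat."""
    term = np.sqrt((ex**2 - ey**2) ** 2 + 4 * (ex * ey * np.cos(delta_rad)) ** 2)
    major = np.sqrt(0.5 * (ex**2 + ey**2 + term))  # semi-major axis of ellipse
    minor = np.sqrt(0.5 * (ex**2 + ey**2 - term))  # semi-minor axis of ellipse
    return 20 * np.log10(major / max(minor, 1e-12))

print(axial_ratio_db(1.0, 1.0, np.pi / 2))  # ~0 dB: ideal circular polarization
print(axial_ratio_db(1.0, 1.0, np.pi / 3))  # ~4.8 dB: phase detuned from 90 deg
print(axial_ratio_db(1.0, 0.7, np.pi / 2))  # ~3.1 dB: amplitudes unbalanced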
/** * Details about any caps (maximum charges) that apply to a particular fee/charge */ @ApiModel(description = "Details about any caps (maximum charges) that apply to a particular fee/charge") @Data @JsonIgnoreProperties(ignoreUnknown = true) public class OverdraftOverdraftFeeChargeCap { /** * Gets or Sets cappingPeriod */ public enum CappingPeriodEnum { DAY("Day"), HALF_YEAR("Half Year"), MONTH("Month"), QUARTER("Quarter"), WEEK("Week"), ACADEMICTERM("AcademicTerm"), YEAR("Year"); private String value; CappingPeriodEnum(String value) { this.value = value; } @JsonValue public String getValue() { return value; } @Override public String toString() { return String.valueOf(value); } @JsonCreator public static CappingPeriodEnum fromValue(String text) { for (CappingPeriodEnum b : CappingPeriodEnum.values()) { if (String.valueOf(b.value).equals(text)) { return b; } } return null; } } @JsonProperty("CappingPeriod") private CappingPeriodEnum cappingPeriod = null; @JsonProperty("FeeCapAmount") private String feeCapAmount = null; @JsonProperty("FeeCapOccurrence") private Float feeCapOccurrence = null; /** * Gets or Sets feeType */ public enum FeeTypeEnum { ARRANGEDOVERDRAFT("ArrangedOverdraft"), EMERGENCYBORROWING("EmergencyBorrowing"), BORROWINGITEM("BorrowingItem"), OVERDRAFTRENEWAL("OverdraftRenewal"), ANNUALREVIEW("AnnualReview"), OVERDRAFTSETUP("OverdraftSetup"), SURCHARGE("Surcharge"), TEMPOVERDRAFT("TempOverdraft"), UNAUTHORISEDBORROWING("UnauthorisedBorrowing"), UNAUTHORISEDPAIDTRANS("UnauthorisedPaidTrans"), OTHER("Other"), UNAUTHORISEDUNPAIDTRANS("UnauthorisedUnpaidTrans"); private String value; FeeTypeEnum(String value) { this.value = value; } @JsonValue public String getValue() { return value; } @Override public String toString() { return String.valueOf(value); } @JsonCreator public static FeeTypeEnum fromValue(String text) { for (FeeTypeEnum b : FeeTypeEnum.values()) { if (String.valueOf(b.value).equals(text)) { return b; } } return null; } } @JsonProperty("FeeType") private List<FeeTypeEnum> feeType = null; /** * Gets or Sets minMaxType */ public enum MinMaxTypeEnum { MINIMUM("Minimum"), MAXIMUM("Maximum"); private String value; MinMaxTypeEnum(String value) { this.value = value; } @JsonValue public String getValue() { return value; } @Override public String toString() { return String.valueOf(value); } @JsonCreator public static MinMaxTypeEnum fromValue(String text) { for (MinMaxTypeEnum b : MinMaxTypeEnum.values()) { if (String.valueOf(b.value).equals(text)) { return b; } } return null; } } @JsonProperty("MinMaxType") private MinMaxTypeEnum minMaxType = null; @JsonProperty("Notes") private List<String> notes = null; @JsonProperty("OtherFeeType") private List<OverdraftOtherFeeType> otherFeeType = null; @JsonProperty("OverdraftControlIndicator") private Boolean overdraftControlIndicator = null; }
# Copyright 2015, Google Inc. # All rights reserved. # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are # met: # # * Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following disclaimer # in the documentation and/or other materials provided with the # distribution. # * Neither the name of Google Inc. nor the names of its # contributors may be used to endorse or promote products derived from # this software without specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. """The RPC-service-side bridge between RPC Framework and GRPC-on-the-wire.""" import abc import enum import logging import threading import time from grpc._adapter import _intermediary_low from grpc._links import _constants from grpc.beta import interfaces as beta_interfaces from grpc.framework.foundation import logging_pool from grpc.framework.foundation import relay from grpc.framework.interfaces.links import links _IDENTITY = lambda x: x _TERMINATION_KIND_TO_CODE = { links.Ticket.Termination.COMPLETION: _intermediary_low.Code.OK, links.Ticket.Termination.CANCELLATION: _intermediary_low.Code.CANCELLED, links.Ticket.Termination.EXPIRATION: _intermediary_low.Code.DEADLINE_EXCEEDED, links.Ticket.Termination.SHUTDOWN: _intermediary_low.Code.UNAVAILABLE, links.Ticket.Termination.RECEPTION_FAILURE: _intermediary_low.Code.INTERNAL, links.Ticket.Termination.TRANSMISSION_FAILURE: _intermediary_low.Code.INTERNAL, links.Ticket.Termination.LOCAL_FAILURE: _intermediary_low.Code.UNKNOWN, links.Ticket.Termination.REMOTE_FAILURE: _intermediary_low.Code.UNKNOWN, } _STOP = _intermediary_low.Event.Kind.STOP _WRITE = _intermediary_low.Event.Kind.WRITE_ACCEPTED _COMPLETE = _intermediary_low.Event.Kind.COMPLETE_ACCEPTED _SERVICE = _intermediary_low.Event.Kind.SERVICE_ACCEPTED _READ = _intermediary_low.Event.Kind.READ_ACCEPTED _FINISH = _intermediary_low.Event.Kind.FINISH @enum.unique class _Read(enum.Enum): READING = 'reading' # TODO(issue 2916): This state will again be necessary after eliminating the # "early_read" field of _RPCState and going back to only reading when granted # allowance to read. 
# AWAITING_ALLOWANCE = 'awaiting allowance' CLOSED = 'closed' @enum.unique class _HighWrite(enum.Enum): OPEN = 'open' CLOSED = 'closed' @enum.unique class _LowWrite(enum.Enum): """The possible categories of low-level write state.""" OPEN = 'OPEN' ACTIVE = 'ACTIVE' CLOSED = 'CLOSED' class _Context(beta_interfaces.GRPCServicerContext): def __init__(self, call): self._lock = threading.Lock() self._call = call self._disable_next_compression = False def peer(self): with self._lock: return self._call.peer() def disable_next_response_compression(self): with self._lock: self._disable_next_compression = True def next_compression_disabled(self): with self._lock: disabled = self._disable_next_compression self._disable_next_compression = False return disabled class _RPCState(object): def __init__( self, request_deserializer, response_serializer, sequence_number, read, early_read, allowance, high_write, low_write, premetadataed, terminal_metadata, code, message, due, context): self.request_deserializer = request_deserializer self.response_serializer = response_serializer self.sequence_number = sequence_number self.read = read # TODO(issue 2916): Eliminate this by eliminating the necessity of calling # call.read just to advance the RPC. self.early_read = early_read # A raw (not deserialized) read. self.allowance = allowance self.high_write = high_write self.low_write = low_write self.premetadataed = premetadataed self.terminal_metadata = terminal_metadata self.code = code self.message = message self.due = due self.context = context def _no_longer_due(kind, rpc_state, key, rpc_states): rpc_state.due.remove(kind) if not rpc_state.due: del rpc_states[key] def _metadatafy(call, metadata): for metadata_key, metadata_value in metadata: call.add_metadata(metadata_key, metadata_value) def _status(termination_kind, high_code, details): low_details = b'' if details is None else details if high_code is None: low_code = _TERMINATION_KIND_TO_CODE[termination_kind] else: low_code = _constants.HIGH_STATUS_CODE_TO_LOW_STATUS_CODE[high_code] return _intermediary_low.Status(low_code, low_details) class _Kernel(object): def __init__(self, request_deserializers, response_serializers, ticket_relay): self._lock = threading.Lock() self._request_deserializers = request_deserializers self._response_serializers = response_serializers self._relay = ticket_relay self._completion_queue = None self._due = set() self._server = None self._rpc_states = {} self._pool = None def _on_service_acceptance_event(self, event, server): server.service(None) service_acceptance = event.service_acceptance call = service_acceptance.call call.accept(self._completion_queue, call) try: group, method = service_acceptance.method.split('/')[1:3] except ValueError: logging.info('Illegal path "%s"!', service_acceptance.method) return request_deserializer = self._request_deserializers.get( (group, method), _IDENTITY) response_serializer = self._response_serializers.get( (group, method), _IDENTITY) call.read(call) context = _Context(call) self._rpc_states[call] = _RPCState( request_deserializer, response_serializer, 1, _Read.READING, None, 1, _HighWrite.OPEN, _LowWrite.OPEN, False, None, None, None, set((_READ, _FINISH,)), context) protocol = links.Protocol(links.Protocol.Kind.SERVICER_CONTEXT, context) ticket = links.Ticket( call, 0, group, method, links.Ticket.Subscription.FULL, service_acceptance.deadline - time.time(), None, event.metadata, None, None, None, None, None, protocol) self._relay.add_value(ticket) def _on_read_event(self, event): call = 
event.tag rpc_state = self._rpc_states[call] if event.bytes is None: rpc_state.read = _Read.CLOSED payload = None termination = links.Ticket.Termination.COMPLETION _no_longer_due(_READ, rpc_state, call, self._rpc_states) else: if 0 < rpc_state.allowance: payload = rpc_state.request_deserializer(event.bytes) termination = None rpc_state.allowance -= 1 call.read(call) else: rpc_state.early_read = event.bytes _no_longer_due(_READ, rpc_state, call, self._rpc_states) return # TODO(issue 2916): Instead of returning: # rpc_state.read = _Read.AWAITING_ALLOWANCE ticket = links.Ticket( call, rpc_state.sequence_number, None, None, None, None, None, None, payload, None, None, None, termination, None) rpc_state.sequence_number += 1 self._relay.add_value(ticket) def _on_write_event(self, event): call = event.tag rpc_state = self._rpc_states[call] if rpc_state.high_write is _HighWrite.CLOSED: if rpc_state.terminal_metadata is not None: _metadatafy(call, rpc_state.terminal_metadata) status = _status( links.Ticket.Termination.COMPLETION, rpc_state.code, rpc_state.message) call.status(status, call) rpc_state.low_write = _LowWrite.CLOSED rpc_state.due.add(_COMPLETE) rpc_state.due.remove(_WRITE) else: ticket = links.Ticket( call, rpc_state.sequence_number, None, None, None, None, 1, None, None, None, None, None, None, None) rpc_state.sequence_number += 1 self._relay.add_value(ticket) rpc_state.low_write = _LowWrite.OPEN _no_longer_due(_WRITE, rpc_state, call, self._rpc_states) def _on_finish_event(self, event): call = event.tag rpc_state = self._rpc_states[call] _no_longer_due(_FINISH, rpc_state, call, self._rpc_states) code = event.status.code if code == _intermediary_low.Code.OK: return if code == _intermediary_low.Code.CANCELLED: termination = links.Ticket.Termination.CANCELLATION elif code == _intermediary_low.Code.DEADLINE_EXCEEDED: termination = links.Ticket.Termination.EXPIRATION else: termination = links.Ticket.Termination.TRANSMISSION_FAILURE ticket = links.Ticket( call, rpc_state.sequence_number, None, None, None, None, None, None, None, None, None, None, termination, None) rpc_state.sequence_number += 1 self._relay.add_value(ticket) def _spin(self, completion_queue, server): while True: event = completion_queue.get(None) with self._lock: if event.kind is _STOP: self._due.remove(_STOP) elif event.kind is _READ: self._on_read_event(event) elif event.kind is _WRITE: self._on_write_event(event) elif event.kind is _COMPLETE: _no_longer_due( _COMPLETE, self._rpc_states.get(event.tag), event.tag, self._rpc_states) elif event.kind is _intermediary_low.Event.Kind.FINISH: self._on_finish_event(event) elif event.kind is _SERVICE: if self._server is None: self._due.remove(_SERVICE) else: self._on_service_acceptance_event(event, server) else: logging.error('Illegal event! 
%s', (event,)) if not self._due and not self._rpc_states: completion_queue.stop() return def add_ticket(self, ticket): with self._lock: call = ticket.operation_id rpc_state = self._rpc_states.get(call) if rpc_state is None: return if ticket.initial_metadata is not None: _metadatafy(call, ticket.initial_metadata) call.premetadata() rpc_state.premetadataed = True elif not rpc_state.premetadataed: if (ticket.terminal_metadata is not None or ticket.payload is not None or ticket.termination is not None or ticket.code is not None or ticket.message is not None): call.premetadata() rpc_state.premetadataed = True if ticket.allowance is not None: if rpc_state.early_read is None: rpc_state.allowance += ticket.allowance else: payload = rpc_state.request_deserializer(rpc_state.early_read) rpc_state.allowance += ticket.allowance - 1 rpc_state.early_read = None if rpc_state.read is _Read.READING: call.read(call) rpc_state.due.add(_READ) termination = None else: termination = links.Ticket.Termination.COMPLETION early_read_ticket = links.Ticket( call, rpc_state.sequence_number, None, None, None, None, None, None, payload, None, None, None, termination, None) rpc_state.sequence_number += 1 self._relay.add_value(early_read_ticket) if ticket.payload is not None: disable_compression = rpc_state.context.next_compression_disabled() if disable_compression: flags = _intermediary_low.WriteFlags.WRITE_NO_COMPRESS else: flags = 0 call.write(rpc_state.response_serializer(ticket.payload), call, flags) rpc_state.due.add(_WRITE) rpc_state.low_write = _LowWrite.ACTIVE if ticket.terminal_metadata is not None: rpc_state.terminal_metadata = ticket.terminal_metadata if ticket.code is not None: rpc_state.code = ticket.code if ticket.message is not None: rpc_state.message = ticket.message if ticket.termination is links.Ticket.Termination.COMPLETION: rpc_state.high_write = _HighWrite.CLOSED if rpc_state.low_write is _LowWrite.OPEN: if rpc_state.terminal_metadata is not None: _metadatafy(call, rpc_state.terminal_metadata) status = _status( links.Ticket.Termination.COMPLETION, rpc_state.code, rpc_state.message) call.status(status, call) rpc_state.due.add(_COMPLETE) rpc_state.low_write = _LowWrite.CLOSED elif ticket.termination is not None: if rpc_state.terminal_metadata is not None: _metadatafy(call, rpc_state.terminal_metadata) status = _status( ticket.termination, rpc_state.code, rpc_state.message) call.status(status, call) rpc_state.due.add(_COMPLETE) def add_port(self, address, server_credentials): with self._lock: if self._server is None: self._completion_queue = _intermediary_low.CompletionQueue() self._server = _intermediary_low.Server(self._completion_queue) if server_credentials is None: return self._server.add_http2_addr(address) else: return self._server.add_secure_http2_addr(address, server_credentials) def start(self): with self._lock: if self._server is None: self._completion_queue = _intermediary_low.CompletionQueue() self._server = _intermediary_low.Server(self._completion_queue) self._pool = logging_pool.pool(1) self._pool.submit(self._spin, self._completion_queue, self._server) self._server.start() self._server.service(None) self._due.add(_SERVICE) def begin_stop(self): with self._lock: self._server.stop() self._due.add(_STOP) self._server = None def end_stop(self): with self._lock: pool = self._pool pool.shutdown(wait=True) class ServiceLink(links.Link): """A links.Link for use on the service-side of a gRPC connection. 
Implementations of this interface are only valid for use between calls to their start method and one of their stop methods. """ @abc.abstractmethod def add_port(self, address, server_credentials): """Adds a port on which to service RPCs after this link has been started. Args: address: The address on which to service RPCs with a port number of zero requesting that a port number be automatically selected and used. server_credentials: An _intermediary_low.ServerCredentials object, or None for insecure service. Returns: An integer port on which RPCs will be serviced after this link has been started. This is typically the same number as the port number contained in the passed address, but will likely be different if the port number contained in the passed address was zero. """ raise NotImplementedError() @abc.abstractmethod def start(self): """Starts this object. This method must be called before attempting to use this Link in ticket exchange. """ raise NotImplementedError() @abc.abstractmethod def begin_stop(self): """Indicate imminent link stop and immediate rejection of new RPCs. New RPCs will be rejected as soon as this method is called, but ongoing RPCs will be allowed to continue until they terminate. This method does not block. """ raise NotImplementedError() @abc.abstractmethod def end_stop(self): """Finishes stopping this link. begin_stop must have been called exactly once before calling this method. All in-progress RPCs will be terminated immediately. """ raise NotImplementedError() class _ServiceLink(ServiceLink): def __init__(self, request_deserializers, response_serializers): self._relay = relay.relay(None) self._kernel = _Kernel( {} if request_deserializers is None else request_deserializers, {} if response_serializers is None else response_serializers, self._relay) def accept_ticket(self, ticket): self._kernel.add_ticket(ticket) def join_link(self, link): self._relay.set_behavior(link.accept_ticket) def add_port(self, address, server_credentials): return self._kernel.add_port(address, server_credentials) def start(self): self._relay.start() return self._kernel.start() def begin_stop(self): self._kernel.begin_stop() def end_stop(self): self._kernel.end_stop() self._relay.stop() def service_link(request_deserializers, response_serializers): """Creates a ServiceLink. Args: request_deserializers: A dict from group-method pair to request object deserialization behavior. response_serializers: A dict from group-method pair to response object serialization behavior. Returns: A ServiceLink. """ return _ServiceLink(request_deserializers, response_serializers)
Review and comparison of works on heterogeneous data and semantic analysis in Big Data In integration approaches, heterogeneity is one of the main challenging factors in the task of providing integration among different data sources, whose solution lies in the search for equality among them. This work describes the state of the art and the theoretical foundation involved in the structural and semantic analysis of heterogeneous data and information. The work aims to review methods and techniques used in data integration in Big Data, considering data heterogeneity and reviewing techniques that use the concepts of the Semantic Web, Cloud Computing, Data Analysis, Big Data, Data Warehouses and other technologies to solve the problem of data heterogeneity. The research was divided into three stages. In the first stage, articles were selected from digital libraries according to their titles and keywords. In the second stage, the papers went through a second filter based on their summary, and, besides that, duplicate articles were also removed. The works' introduction and conclusion were analyzed in the third stage to select the articles belonging to this systematic review. Throughout the study, articles were analyzed, compared and categorized. At the end of each section, the interrelationships and possible areas for future work are shown. Introduction In building a Data Warehouse (DW), data transfer activities from the origin data source to its destination are highly complicated and iterative, especially as new data sources are added. DW construction and data management approaches have been supported by integration tools through the Extract, Transform, and Load (ETL) process (KIMBALL; ROSS, 2011). These tools are designed to handle large volumes of data and are not flexible for handling semi-structured or unstructured data. Another aspect that must be considered in the integration process is heterogeneity. In traditional approaches to integration, heterogeneity is one of the main factors that increase the complexity of providing integration between different data sources. This factor occurs at two levels. At the technical or structural level, heterogeneity refers to the technological differences between the various components of hardware, software, and communication. Typical conflicts include: conflicts between names, which derive from different ways of modeling a real-world problem, as it is common to find data with the same semantic content yet with different names (synonyms), and the presence of homonyms, which denote the use of the same name to refer to different concepts; conflicts between different Database (DB) schemes that use distinct structures to represent the same information; conflicts between different DB schemes that use similar structures to represent diverse information; conflicts of inclusion of entities and attributes due to generalization abstraction, which happen when an entity from one database is logically included in another entity from another DB; data type conflicts due to aggregation abstraction, which happen when the domain or type of the attribute differs for semantically equivalent attributes; and composition conflicts due to aggregation abstraction, which happen when similar concepts are represented as an aggregation in one DB but not in another. Another fundamental problem in the integration process is semantic heterogeneity. Its resolution is essential to provide interoperability between multiple sources.
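As a toy illustration of the name conflicts listed above, and of the mapping-based resolution that the reviewed works formalize with schemas or ontologies, the following hypothetical Python sketch unifies two synonym-laden source records under a global schema. All source and attribute names here are invented.

# Two sources model the same real-world attributes under different names.
SOURCE_A = {"cust_name": "Acme Ltd", "tel": "555-0100", "zip": "10001"}
SOURCE_B = {"company": "Acme Ltd", "phone_number": "555-0100", "postcode": "10001"}

# Hand-built mapping from each local attribute to a global attribute name.
GLOBAL_MAPPING = {
    "cust_name": "name", "company": "name",          # synonyms
    "tel": "phone", "phone_number": "phone",
    "zip": "postal_code", "postcode": "postal_code",
}

def to_global(record):
    """Rename local attributes to the global schema, dropping unmapped ones."""
    return {GLOBAL_MAPPING[k]: v for k, v in record.items() if k in GLOBAL_MAPPING}

assert to_global(SOURCE_A) == to_global(SOURCE_B)  # same entity, unified view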
Different conceptualizations and different DB schemes are typically used to represent such replicated data. According to Dong and Srivastava, some issues must be resolved to manage semantic heterogeneity, among them: provide an integrated view of the overlapping data sets of multiple DBs; provide support for updating against an integrated view; identify and specify the relationships between two or more instantiations of replicated data; and keep replicated data in sync. The situation is entirely different in the Big Data environment, as traditional integration approaches prove inefficient for handling the problem (DONG; SRIVASTAVA, 2013). Big Data basically derives from a large or huge data volume, defined as a situation in which the volume, speed, and variety of data exceed the storage or computing capacity for making accurate and timely decisions. The storage of these large volumes of data can be done by introducing data centers, where data manipulation is complex and must be considered carefully (CUZZOCREA; SONG; DAVIS, 2011). According to Cuzzocrea, Song and Davis, Big Data refers to huge amounts of unstructured data produced by high-performance applications that fall into a wide and heterogeneous family of application scenarios, such as scientific computing applications, social networks, e-government applications, and medical information systems, among others. The data stored in the underlying layer of all these application scenarios have some specific common characteristics, including: I. Large-scale data, which refers to the size and distribution of the data repositories; II. Scalability issues, which refer to the ability of applications running on large-scale data repositories to scale with the rapid growth in data size and inputs; III. Advanced support for the low-level ETL process over raw data with little structured information; IV. Easy and interpretable design and development of analytics over large-volume repositories to obtain intelligence and extract valuable knowledge. According to Dong and Srivastava, integration in Big Data differs from traditional data integration (which includes virtual and materialized integration) along several dimensions: Volume: not only does each data source contain a huge volume of data, but the number of data sources, even for a single domain, also grows to tens of thousands; Speed: as a direct consequence of the rate at which data is being collected and continuously made available, many of the data sources are very dynamic; Variety: data sources (even in the same domain) are highly heterogeneous, both at the schema level, in how they structure their data, and at the instance level, in how they describe the same real-world entity, exhibiting considerable variety even for substantially similar entities; Veracity: data sources (even in the same domain) have very different qualities, with significant differences in the coverage, accuracy, and freshness of the data provided. For variety at the schema level, the work by Li et al. shows that different sources in the stock market provide many attributes (333 in total), among them attributes with the same semantics but different names, which were manually matched, resulting in 153 attributes, called global attributes. The coverage of these attributes follows a Zipf-like distribution: only a tiny portion of attributes has high coverage, and most have low coverage (a toy version of this coverage computation is sketched below).
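The following sketch reproduces this kind of coverage computation on hypothetical data (not Li et al.'s actual 333-attribute corpus): counting, for each global attribute, the fraction of sources that provide it exposes the long, sparsely covered tail.

from collections import Counter

# Hypothetical stock-market sources, each providing a set of global attributes.
sources = [
    {"price", "volume", "change"},
    {"price", "volume", "52wk_high"},
    {"price", "dividend"},
    {"price", "volume", "change", "beta"},
]

counts = Counter(attr for attrs in sources for attr in attrs)
coverage = {attr: n / len(sources) for attr, n in counts.most_common()}
for attr, frac in coverage.items():
    print(f"{attr}: provided by {frac:.0%} of sources")
# Only 'price' (and to a lesser degree 'volume') has high coverage; the
# remaining attributes form the low-coverage tail of the distribution.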
In fact, 21 attributes (13.7%) are provided by at least a third of the sources, and more than 86% of attributes are defined by less than 25% of the sources. For variety at the instance level, Guo et al. show an experiment with business data on a single website that aggregates data obtained from other sources. It considers only the name, telephone, and address attributes for two zip codes and presents a solution for linking records in the presence of uniqueness restrictions and wrong values, using a linking and merging process applied globally. Thus, to integrate different data sources, it is necessary to solve the problems of heterogeneity, whose solution lies in the search for equality between the various data sources. This work aims to present a systematic review of the subject's literature in order to examine structural and semantic heterogeneity in the process of integrating different data sources in Big Data. Materials and methods The work reviews the methods used in data integration in Big Data, considering the heterogeneity of data and analyzing techniques that use Semantic Web concepts, Cloud Computing, data analysis, Big Data, DW, and other technologies, covering the period from 2011 to 2020. The bibliographical research was carried out in English and Portuguese. The vast majority of publications and conferences related to this subject are in English, so that language was included in the search terms. However, given the need to consider national scientific production, Portuguese was also included. The articles reviewed in this research were taken from the following digital libraries: IEEE publications (IEEE Xplore); Science Direct; ACM Digital Library; Google Scholar; Scopus. Research questions Q1. What initiatives were presented for the problem of structural heterogeneity in the process of integrating different data sources in Big Data? Q2. What initiatives were presented for the problem of semantic heterogeneity in the process of integrating different data sources in Big Data? Terms and synonyms used in this research The terms and synonyms used in this work are shown in Table 1. In Table 2, search expression 1 was created to select articles that involved data integration, heterogeneity, and semantics, focusing on algorithms and data science. Search expression 2 was created to select articles that involved data integration and heterogeneity, focusing on structural conflicts and data science. Search expression 3 was created to select articles that involved data integration and that additionally focused on autonomous sources. Search expression 4 was created to select articles involving data integration and semantic and structural heterogeneity that additionally focused on data warehousing. The search expressions were run on the respective search engines with adaptations for correct document retrieval. Only advanced search mode descriptors were used in the search engines. Document selection and tool used The Mendeley Desktop tool was the reference manager used to manage the publications retrieved by the search engines. Mendeley identified repetitions, retrieved complete documents, and provided a field for notes on articles. The sources for the systematic review were then selected and evaluated according to the inclusion and exclusion criteria, dividing the research into three stages. In the first stage, articles were searched in the digital libraries mentioned above and selected based on their title and keywords.
In the second stage, articles went through a second filter based on their summary; in addition, duplicate articles were removed. In the third and last stage, the works' introduction and conclusion were analyzed to select the articles for the systematic review. Table 3 shows the number of references retrieved according to the search engine used. As seen in Table 4, after reading and analyzing the relevance of titles, keywords, and abstracts to the proposal of this literature review, 1583 articles were discarded. As seen in Table 5, after the deletion process, 34 articles remained. Proposed criteria for article analysis Articles were classified according to the following criteria: Big Data: informs if the term Big Data is used; Business Intelligence (BI): informs if BI is used; Data integration: informs if the term data integration is used; Cloud computing: informs if the term cloud computing is used; Semantic heterogeneity: informs if the term semantic heterogeneity is used; Ontology: informs if the term ontology is used. Based on these criteria, the articles were then analyzed considering the following subsections: resolution of semantic heterogeneity through data analysis; resolution of heterogeneity in Big Data; resolution of heterogeneity with a focus on BI; resolution of heterogeneity using DW and Middleware; and resolution of heterogeneity using ontologies and Semantic Web technologies. Selected articles' analysis In this section, according to the criteria presented above, the articles are grouped and analyzed considering the proposals presented for the resolution of structural and semantic heterogeneity. Resolution of heterogeneity using ontologies and Semantic Web technologies The work by Nugraheni, Akbar and Saptawati proposes a framework for developing a semantic DW that can handle incomplete data and data heterogeneity problems (format, syntax, and structure) through the use of ontologies. The framework also provides tools for dealing with an incomplete data source and for transforming instances of objects relevant to a class of external ontologies that generate the required data. A hybrid methodology identifies the multidimensional elements and the conceptual design of the multidimensional scheme. The approach created by Fathy, Gad and Badr aims to translate queries written in the SPARQL Protocol and RDF Query Language (SPARQL) into NoSQL graph queries. In the first step, an extended xR2RML (RDB-to-RDF mapping language) mapping file, which includes the triple maps of the semantic model in question for the NoSQL graph, is used as input, creating an intermediate representation between SPARQL and NoSQL graph queries. Then, a SPARQL query is sent and translated into a NoSQL graph query based on the mapping created in the first step, so that the generated query can be executed in the NoSQL graph DB. The algorithm then rewrites the query to perform the search on the original data source. Each data source has a local ontology (Source Graph) with a wrapper responsible for performing queries on the sources. Each of these wrappers is related to a fragment of the global ontology. A semi-automatic algorithm was also created that updates the global ontology as the data source changes. In Ostrowski et al., a supply-chain risk management system for the automotive sector is presented with the objective of integrating data. It aims to combine data from the automakers and suppliers of an automotive factory with meteorological data.
It uses the AllegroGraph software, which works with geospatial ontologies already on the Web, allowing latitude and longitude to be assigned to its suppliers and assemblers based on their street addresses. Data from a climate ontology was also added to that of the automakers/suppliers. The final ontology, consulted through SPARQL queries, can predict weather events that may delay the sale and manufacture of parts. For future work, the author identifies several challenges that need to be solved, such as incompatible data, support for real-time data flow, and incomplete data. The work by Nimmagadda and Dreher is also reviewed. In the single-ontology approach, the source schemas are made available through a shared global ontology, which offers a uniform interface for each user. In the multiple-ontology approach, local sources are mapped to local ontologies, and the connections between them are established semantically. Finally, the hybrid ontology approach explores the combination of the first two: the sources are described by local ontologies, which are mapped to a shared global ontology. Another important point is which architecture each article's proposal was based on, for example, DW, Middleware, or another technology for accessing data sources in order to integrate and make information available; finally, additional comments on each proposal are presented. Table 6 summarizes these aspects. Another reviewed framework uses intelligence technologies to automatically integrate large amounts of data from several sources, whether structured or unstructured. The proposed framework aims to evaluate the data based on its metadata to verify the similarity between data and its possible level of integration. Its structure is flexible, as it uses integration modules, easing maintenance, implementation, and integration of new data models if necessary. Considerations on the resolution of semantic heterogeneity through data analysis The discussed works present a process of integrating data sources seeking to address structural and semantic heterogeneity using layered frameworks or architectures, presenting a unified view for data queries. (Table: the approaches compared include a shared global ontology; an ontological model using a data translation system from the source to ontological triples; and semantic integration. Source: the authors.) The main differences between the works by Madkour, Aref and Basalamah and by Pelekis, Theodoridis and Janssens are that the first uses a location-based semantic system while the second forms a semantic cube by subject through linked data, which can be organized by topics. A noted direction for future work is the creation of more complete industrial ontologies. Resolution of heterogeneity in Big Data This group includes the work by Deb Nath, Hose and Pedersen, among others. Considerations on the resolution of semantic heterogeneity through Big Data In the work by Deb Nath, Hose and Pedersen, the data sources can be structured, semi-structured, RDF and SPARQL. All ETL phases make use of the ontology definition, and the generated database can be consulted through SPARQL. For Mountasser, Storey and Frikh, unstructured sources, geographic location, and log files are also considered. In the work by Bortoli et al., the semantic ETL uses OpenRefine, with storage in the HDFS system. Using the Hadoop ecosystem, Chen et al. present a proposal to integrate heterogeneous data sources into the DCS environment, using parallel computing power together with the MapReduce algorithm and a uniform data model specific to DCS.
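As a minimal, self-contained illustration of the pattern several of these works share, facts from heterogeneous sources expressed as RDF triples under a common vocabulary and consulted with SPARQL, the following Python sketch uses rdflib with a hypothetical supply-chain vocabulary; it is not the reviewed systems' actual code.

from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/supply#")  # hypothetical vocabulary
g = Graph()
g.add((EX.supplier1, RDF.type, EX.Supplier))
g.add((EX.supplier1, EX.city, Literal("Detroit")))
g.add((EX.storm42, RDF.type, EX.WeatherEvent))
g.add((EX.storm42, EX.affectsCity, Literal("Detroit")))

# Which suppliers sit in a city affected by a weather event?
query = """
PREFIX ex: <http://example.org/supply#>
SELECT ?supplier ?event WHERE {
    ?supplier a ex:Supplier ; ex:city ?city .
    ?event a ex:WeatherEvent ; ex:affectsCity ?city .
}
"""
for supplier, event in g.query(query):
    print(supplier, "may be delayed by", event)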
As can be seen, according to the works presented, the use of ontologies and the semantic ETL process is fundamental in Big Data for making data available to users. Bondarev and Zakirov show the technologies involved in building and using a DW for data integration. In addition to the process of integrating structured data, their work made use of the Hadoop platform to integrate unstructured and semi-structured data. Shafiee, Barker and Rasekh aim to automate and improve a hydraulic data system, which receives information from a weather forecast database and data from various sensors. In their approach, the authors use parallel computing so that the data can be processed and stored. A data lake called the "Water Data Lake" stores the data throughout the process. Hadoop-based technologies were used in its creation. The data analysis process, responsible for cleaning data, filling in necessary values, and filtering out irrelevant information, was automated by the authors. The analysis process is also done in a distributed way using the Apache Spark software, which is responsible for distributing the data transformation tasks across a computational cluster. After being processed, the data is stored again in the "Water Data Lake". A Middleware layer performs search queries on the data lake as soon as newly processed data is available. This step is responsible for standardizing and validating the data and then sending it to the wrappers responsible for receiving the data streams, modeling them, and sending the results back to the data lake. The results can then be used for analysis and decision-making. Considerations on the resolution of semantic heterogeneity through business intelligence It is observed that the discussed works adopt different approaches, as shown in Table 9, which presents the different choices for building a database to support decision-making. The works by Bondarev and Zakirov, Ghosh, Halder and Sen, and Aufaure et al. use a BI architecture, the last operating in real time. A limitation of the work by Bala, Boussaid and Alimazighi, due to the parallelism of the ETL process, is that the response time depends on the number of tasks; to improve the performance of the system, the addition of new nodes and tasks is suggested. The work by Abbas et al. differs because it uses a cloud-based structure to provide access to information. According to Choi, Chan and Yue, BI systems have evolved toward real-time data collection in, for example, airline companies and automotive industries, due to the use of technologies such as Radio-Frequency IDentification (RFID). The Internet of Things (IoT) also generates data in real time and can be applied to food supply chains, stock management, and transportation, among others. However, IoT deployments are subject to an intrinsic restriction created by the heterogeneous nature of the sensors. With the design of a specific system to address this heterogeneity, IoT can be integrated with BI systems. Resolution of heterogeneity using data warehouses and middleware One reviewed work examines agencies that submit data manually to the government, concluding that the old system has its limitations, among them the fact that the information sits in data silos that are not integrated with each other. If any agency needs the data of another, it has to integrate the data itself according to its needs. This process is highly laborious and manual. The proposed solution is to create a unified system that integrates the data received from agencies, rather than integrating data every time information is needed.
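The works above converge on a semantic ETL pattern: extract records from heterogeneous sources, transform them to a shared schema (resolving synonymous field names and unit differences), and load the harmonized rows. The following compact Python sketch illustrates that pattern; all source formats and field names are hypothetical.

import csv, io, json

CSV_SRC = "agency,year,budget_musd\nHealth,2020,1.2\n"
JSON_SRC = '[{"org": "Education", "fiscal_year": 2020, "budget_usd": 3400000}]'

def extract():
    # Extract from a CSV source, converting millions of USD to USD.
    for row in csv.DictReader(io.StringIO(CSV_SRC)):
        yield {"agency": row["agency"], "year": int(row["year"]),
               "budget_usd": float(row["budget_musd"]) * 1e6}
    # Extract from a JSON source, mapping synonymous field names.
    for rec in json.loads(JSON_SRC):
        yield {"agency": rec["org"], "year": rec["fiscal_year"],
               "budget_usd": float(rec["budget_usd"])}

warehouse = list(extract())  # "load": here, just an in-memory table
for row in warehouse:
    print(row)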
Conclusion This research aimed to analyze the problem of structural and semantic heterogeneity in the process of integrating different data sources in Big Data, as well as the proposed solutions, based on the literature found from 2011 to 2020. We sought to answer the research question: What initiatives were presented for the problem of structural and semantic heterogeneity in integrating different data sources in Big Data? The sources used for this research were IEEE Xplore, Science Direct, the ACM Digital Library, Google Scholar, and Scopus. In conducting the systematic review, sources were selected and evaluated according to the inclusion and exclusion criteria defined in the review protocol, and 34 articles were selected and analyzed by this research. They were classified according to the aggregating attributes: Big Data, BI, Data integration, Cloud computing, Semantic heterogeneity, and Ontologies. Based on these aggregating attributes, the articles were distributed over the following topics: Data Analysis; Big Data; BI; DW and Middleware; and Ontologies and Semantic Web technologies. In this systematic review, several data integration techniques could be analyzed. Some use ETL methods to extract data from its sources, treat and homogenize that data, and store it. In contrast, others create global ontologies from databases and then use mappings to translate queries sent to these ontologies for execution at the source of origin. In the scope of semantic heterogeneity, semantic ETL stands out for performing data extraction and treatment. When addressing structural heterogeneity and the use of ontologies to achieve integrations free of dependence on keywords or similarity metrics, techniques such as cloud computing and multi-layer data prototypes are used in the processing and management of Big Data. Some obstacles in Big Data still need to be overcome. Data sources can be incomplete and incompatible and, although reduced, the need for manual work remains. However, as seen in this review, several researchers are managing to overcome some of these challenges using Semantic Web, parallel computing, and AI concepts. As expected, each article has its own line of study. However, it is possible to note that even studies with different characteristics use similar solutions, which mainly involve the use of semantic ETL to treat the heterogeneity of the data before it can be stored in the DW. It is also observed that semantics is used to improve the quality and reliability of different data objects. This research revealed the growing use of algorithms in query processing between different data sources and of logs from heterogeneous data sources. In addition, growing interest in privacy, reliability, and OLAP capabilities over large volumes of information was noted. Future research may involve the points mentioned above and the need to develop frameworks and semantic cubes to create a unified structure to combine and manage data.
# -*- coding: utf-8 -*- # Part of Odoo. See LICENSE file for full copyright and licensing details. from odoo.addons.mrp.tests.common import TestMrpCommon from odoo.tests import Form from odoo.tests.common import SavepointCase class TestMrpProductionBackorder(TestMrpCommon): @classmethod def setUpClass(cls): super().setUpClass() cls.stock_location = cls.env.ref('stock.stock_location_stock') def setUp(self): super().setUp() warehouse_form = Form(self.env['stock.warehouse']) warehouse_form.name = 'Test Warehouse' warehouse_form.code = 'TWH' self.warehouse = warehouse_form.save() def test_no_tracking_1(self): """Create a MO for 4 product. Produce 4. The backorder button should not appear and hitting mark as done should not open the backorder wizard. The name of the MO should be MO/001. """ mo = self.generate_mo(qty_final=4)[0] mo_form = Form(mo) mo_form.qty_producing = 4 mo = mo_form.save() # No backorder is proposed self.assertTrue(mo.button_mark_done()) self.assertEqual(mo._get_quantity_to_backorder(), 0) self.assertTrue("-001" not in mo.name) def test_no_tracking_2(self): """Create a MO for 4 product. Produce 1. The backorder button should appear and hitting mark as done should open the backorder wizard. In the backorder wizard, choose to do the backorder. A new MO for 3 self.untracked_bom should be created. The sequence of the first MO should be MO/001-01, the sequence of the second MO should be MO/001-02. Check that all MO are reachable through the procurement group. """ production, _, _, product_to_use_1, _ = self.generate_mo(qty_final=4, qty_base_1=3) self.assertEqual(production.state, 'confirmed') self.assertEqual(production.reserve_visible, True) # Make some stock and reserve for product in production.move_raw_ids.product_id: self.env['stock.quant'].with_context(inventory_mode=True).create({ 'product_id': product.id, 'inventory_quantity': 100, 'location_id': production.location_src_id.id, }) production.action_assign() self.assertEqual(production.state, 'confirmed') self.assertEqual(production.reserve_visible, False) mo_form = Form(production) mo_form.qty_producing = 1 production = mo_form.save() action = production.button_mark_done() backorder = Form(self.env['mrp.production.backorder'].with_context(**action['context'])) backorder.save().action_backorder() # Two related MO to the procurement group self.assertEqual(len(production.procurement_group_id.mrp_production_ids), 2) # Check MO backorder mo_backorder = production.procurement_group_id.mrp_production_ids[-1] self.assertEqual(mo_backorder.product_id.id, production.product_id.id) self.assertEqual(mo_backorder.product_qty, 3) self.assertEqual(sum(mo_backorder.move_raw_ids.filtered(lambda m: m.product_id.id == product_to_use_1.id).mapped("product_uom_qty")), 9) self.assertEqual(mo_backorder.reserve_visible, False) # the reservation of the first MO should've been moved here def test_no_tracking_pbm_1(self): """Create a MO for 4 product. Produce 1. The backorder button should appear and hitting mark as done should open the backorder wizard. In the backorder wizard, choose to do the backorder. A new MO for 3 self.untracked_bom should be created. The sequence of the first MO should be MO/001-01, the sequence of the second MO should be MO/001-02. Check that all MO are reachable through the procurement group. 
""" with Form(self.warehouse) as warehouse: warehouse.manufacture_steps = 'pbm' production, _, product_to_build, product_to_use_1, product_to_use_2 = self.generate_mo(qty_base_1=4, qty_final=4, picking_type_id=self.warehouse.manu_type_id) move_raw_ids = production.move_raw_ids self.assertEqual(len(move_raw_ids), 2) self.assertEqual(set(move_raw_ids.mapped("product_id")), {product_to_use_1, product_to_use_2}) pbm_move = move_raw_ids.move_orig_ids self.assertEqual(len(pbm_move), 2) self.assertEqual(set(pbm_move.mapped("product_id")), {product_to_use_1, product_to_use_2}) self.assertFalse(pbm_move.move_orig_ids) mo_form = Form(production) mo_form.qty_producing = 1 production = mo_form.save() self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_1.id).mapped("product_qty")), 16) self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_2.id).mapped("product_qty")), 4) action = production.button_mark_done() backorder = Form(self.env['mrp.production.backorder'].with_context(**action['context'])) backorder.save().action_backorder() mo_backorder = production.procurement_group_id.mrp_production_ids[-1] self.assertEqual(mo_backorder.delivery_count, 1) pbm_move |= mo_backorder.move_raw_ids.move_orig_ids # Check that quantity is correct self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_1.id).mapped("product_qty")), 16) self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_2.id).mapped("product_qty")), 4) self.assertFalse(pbm_move.move_orig_ids) def test_no_tracking_pbm_sam_1(self): """Create a MO for 4 product. Produce 1. The backorder button should appear and hitting mark as done should open the backorder wizard. In the backorder wizard, choose to do the backorder. A new MO for 3 self.untracked_bom should be created. The sequence of the first MO should be MO/001-01, the sequence of the second MO should be MO/001-02. Check that all MO are reachable through the procurement group. 
""" with Form(self.warehouse) as warehouse: warehouse.manufacture_steps = 'pbm_sam' production, _, product_to_build, product_to_use_1, product_to_use_2 = self.generate_mo(qty_base_1=4, qty_final=4, picking_type_id=self.warehouse.manu_type_id) move_raw_ids = production.move_raw_ids self.assertEqual(len(move_raw_ids), 2) self.assertEqual(set(move_raw_ids.mapped("product_id")), {product_to_use_1, product_to_use_2}) pbm_move = move_raw_ids.move_orig_ids self.assertEqual(len(pbm_move), 2) self.assertEqual(set(pbm_move.mapped("product_id")), {product_to_use_1, product_to_use_2}) self.assertFalse(pbm_move.move_orig_ids) self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_1.id).mapped("product_qty")), 16) self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_2.id).mapped("product_qty")), 4) sam_move = production.move_finished_ids.move_dest_ids self.assertEqual(len(sam_move), 1) self.assertEqual(sam_move.product_id.id, product_to_build.id) self.assertEqual(sum(sam_move.mapped("product_qty")), 4) mo_form = Form(production) mo_form.qty_producing = 1 production = mo_form.save() action = production.button_mark_done() backorder = Form(self.env['mrp.production.backorder'].with_context(**action['context'])) backorder.save().action_backorder() mo_backorder = production.procurement_group_id.mrp_production_ids[-1] self.assertEqual(mo_backorder.delivery_count, 2) pbm_move |= mo_backorder.move_raw_ids.move_orig_ids self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_1.id).mapped("product_qty")), 16) self.assertEqual(sum(pbm_move.filtered(lambda m: m.product_id.id == product_to_use_2.id).mapped("product_qty")), 4) sam_move |= mo_backorder.move_finished_ids.move_orig_ids self.assertEqual(sum(sam_move.mapped("product_qty")), 4) def test_tracking_backorder_series_lot_1(self): """ Create a MO of 4 tracked products. all component is tracked by lots Produce one by one with one bakorder for each until end. 
""" nb_product_todo = 4 production, _, p_final, p1, p2 = self.generate_mo(qty_final=nb_product_todo, tracking_final='lot', tracking_base_1='lot', tracking_base_2='lot') lot_final = self.env['stock.production.lot'].create({ 'name': 'lot_final', 'product_id': p_final.id, 'company_id': self.env.company.id, }) lot_1 = self.env['stock.production.lot'].create({ 'name': 'lot_consumed_1', 'product_id': p1.id, 'company_id': self.env.company.id, }) lot_2 = self.env['stock.production.lot'].create({ 'name': 'lot_consumed_2', 'product_id': p2.id, 'company_id': self.env.company.id, }) self.env['stock.quant']._update_available_quantity(p1, self.stock_location, nb_product_todo*4, lot_id=lot_1) self.env['stock.quant']._update_available_quantity(p2, self.stock_location, nb_product_todo, lot_id=lot_2) production.action_assign() active_production = production for i in range(nb_product_todo): details_operation_form = Form(active_production.move_raw_ids.filtered(lambda m: m.product_id == p1), view=self.env.ref('stock.view_stock_move_operations')) with details_operation_form.move_line_ids.edit(0) as ml: ml.qty_done = 4 ml.lot_id = lot_1 details_operation_form.save() details_operation_form = Form(active_production.move_raw_ids.filtered(lambda m: m.product_id == p2), view=self.env.ref('stock.view_stock_move_operations')) with details_operation_form.move_line_ids.edit(0) as ml: ml.qty_done = 1 ml.lot_id = lot_2 details_operation_form.save() production_form = Form(active_production) production_form.qty_producing = 1 production_form.lot_producing_id = lot_final active_production = production_form.save() active_production.button_mark_done() if i + 1 != nb_product_todo: # If last MO, don't make a backorder action = active_production.button_mark_done() backorder = Form(self.env['mrp.production.backorder'].with_context(**action['context'])) backorder.save().action_backorder() active_production = active_production.procurement_group_id.mrp_production_ids[-1] self.assertEqual(self.env['stock.quant']._get_available_quantity(p_final, self.stock_location, lot_id=lot_final), nb_product_todo, f'You should have the {nb_product_todo} final product in stock') self.assertEqual(len(production.procurement_group_id.mrp_production_ids), nb_product_todo) def test_tracking_backorder_series_serial_1(self): """ Create a MO of 4 tracked products (serial) with pbm_sam. all component is tracked by serial Produce one by one with one bakorder for each until end. 
""" nb_product_todo = 4 production, _, p_final, p1, p2 = self.generate_mo(qty_final=nb_product_todo, tracking_final='serial', tracking_base_1='serial', tracking_base_2='serial', qty_base_1=1) serials_final, serials_p1, serials_p2 = [], [], [] for i in range(nb_product_todo): serials_final.append(self.env['stock.production.lot'].create({ 'name': f'lot_final_{i}', 'product_id': p_final.id, 'company_id': self.env.company.id, })) serials_p1.append(self.env['stock.production.lot'].create({ 'name': f'lot_consumed_1_{i}', 'product_id': p1.id, 'company_id': self.env.company.id, })) serials_p2.append(self.env['stock.production.lot'].create({ 'name': f'lot_consumed_2_{i}', 'product_id': p2.id, 'company_id': self.env.company.id, })) self.env['stock.quant']._update_available_quantity(p1, self.stock_location, 1, lot_id=serials_p1[-1]) self.env['stock.quant']._update_available_quantity(p2, self.stock_location, 1, lot_id=serials_p2[-1]) production.action_assign() active_production = production for i in range(nb_product_todo): details_operation_form = Form(active_production.move_raw_ids.filtered(lambda m: m.product_id == p1), view=self.env.ref('stock.view_stock_move_operations')) with details_operation_form.move_line_ids.edit(0) as ml: ml.qty_done = 1 ml.lot_id = serials_p1[i] details_operation_form.save() details_operation_form = Form(active_production.move_raw_ids.filtered(lambda m: m.product_id == p2), view=self.env.ref('stock.view_stock_move_operations')) with details_operation_form.move_line_ids.edit(0) as ml: ml.qty_done = 1 ml.lot_id = serials_p2[i] details_operation_form.save() production_form = Form(active_production) production_form.qty_producing = 1 production_form.lot_producing_id = serials_final[i] active_production = production_form.save() active_production.button_mark_done() if i + 1 != nb_product_todo: # If last MO, don't make a backorder action = active_production.button_mark_done() backorder = Form(self.env['mrp.production.backorder'].with_context(**action['context'])) backorder.save().action_backorder() active_production = active_production.procurement_group_id.mrp_production_ids[-1] self.assertEqual(self.env['stock.quant']._get_available_quantity(p_final, self.stock_location), nb_product_todo, f'You should have the {nb_product_todo} final product in stock') self.assertEqual(len(production.procurement_group_id.mrp_production_ids), nb_product_todo) class TestMrpWorkorderBackorder(SavepointCase): @classmethod def setUpClass(cls): super(TestMrpWorkorderBackorder, cls).setUpClass() cls.uom_unit = cls.env['uom.uom'].search([ ('category_id', '=', cls.env.ref('uom.product_uom_categ_unit').id), ('uom_type', '=', 'reference') ], limit=1) cls.finished1 = cls.env['product.product'].create({ 'name': 'finished1', 'type': 'product', }) cls.compfinished1 = cls.env['product.product'].create({ 'name': 'compfinished1', 'type': 'product', }) cls.compfinished2 = cls.env['product.product'].create({ 'name': 'compfinished2', 'type': 'product', }) cls.workcenter1 = cls.env['mrp.workcenter'].create({ 'name': 'workcenter1', }) cls.workcenter2 = cls.env['mrp.workcenter'].create({ 'name': 'workcenter2', }) cls.bom_finished1 = cls.env['mrp.bom'].create({ 'product_id': cls.finished1.id, 'product_tmpl_id': cls.finished1.product_tmpl_id.id, 'product_uom_id': cls.uom_unit.id, 'product_qty': 1, 'consumption': 'flexible', 'type': 'normal', 'bom_line_ids': [ (0, 0, {'product_id': cls.compfinished1.id, 'product_qty': 1}), (0, 0, {'product_id': cls.compfinished2.id, 'product_qty': 1}), ], 'operation_ids': [ (0, 0, 
{'sequence': 1, 'name': 'finished operation 1', 'workcenter_id': cls.workcenter1.id}), (0, 0, {'sequence': 2, 'name': 'finished operation 2', 'workcenter_id': cls.workcenter2.id}), ], }) cls.bom_finished1.bom_line_ids[0].operation_id = cls.bom_finished1.operation_ids[0].id cls.bom_finished1.bom_line_ids[1].operation_id = cls.bom_finished1.operation_ids[1].id
/** * All Internal Functions need to be registered in Internal::Bind() */ TOKEN Semantics::ValidateInternal(ASTFuncCall* call){ CHECK(call != NULL); TRACK(call); TOKEN t = Internal::Bind(call); for (unsigned int i = 0; i < call->params.size(); i++) ValidateExpr(call->params[i]); Trace(" Type: ", "Internal"); return t; }
/**
 * Copyright (c) 2021 OceanBase
 * OceanBase Database Proxy(ODP) is licensed under Mulan PubL v2.
 * You can use this software according to the terms and conditions of the Mulan PubL v2.
 * You may obtain a copy of Mulan PubL v2 at:
 *          http://license.coscl.org.cn/MulanPubL-2.0
 * THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
 * EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
 * MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
 * See the Mulan PubL v2 for more details.
 */

#ifndef _OB_MALLOC_ALLOCATOR_H_
#define _OB_MALLOC_ALLOCATOR_H_

#include "lib/allocator/ob_allocator.h"
#include "lib/alloc/ob_tenant_allocator.h"
#include "lib/alloc/alloc_func.h"
#include "lib/lock/tbrwlock.h"

namespace oceanbase
{
namespace common
{
class ObModItem;
} // end of namespace common

namespace lib
{

struct ObTenantMemory
{
  uint64_t tenant_id_;
  int64_t limit_;
  int64_t hold_;
  common::ObModItem item_;
};

// This is the implementation behind the ob_malloc/ob_free/ob_realloc
// interface. The class keeps a separate allocator for each tenant, so that
// tenants are isolated from one another.
class ObMallocAllocator : public common::ObIAllocator
{
  static const uint64_t PRESERVED_TENANT_COUNT = 10000;

public:
  ObMallocAllocator();
  virtual ~ObMallocAllocator();

  int init();

  void *alloc(const int64_t size);
  void *alloc(const int64_t size, const ObMemAttr &attr);
  void *realloc(const void *ptr, const int64_t size, const ObMemAttr &attr);
  void free(void *ptr);

  int set_root_allocator(ObTenantAllocator *allocator);
  static ObMallocAllocator *get_instance();

  int create_tenant_allocator(uint64_t tenant_id);
  int delete_tenant_allocator(uint64_t tenant_id);

  // statistics-related interface
  void set_urgent(int64_t bytes);
  int64_t get_urgent() const;
  void set_reserved(int64_t bytes);
  int64_t get_reserved() const;
  int set_tenant_limit(uint64_t tenant_id, int64_t bytes);
  int64_t get_tenant_limit(uint64_t tenant_id) const;
  int64_t get_tenant_hold(uint64_t tenant_id) const;
  void get_tenant_mod_usage(uint64_t tenant_id, int mod_id, common::ObModItem &item) const;
  int64_t get_mod_dist(int mod_id, ObTenantMemory tenant_memory[], int64_t count);
  void print_tenant_memory_usage(uint64_t tenant_id) const;

private:
  DISALLOW_COPY_AND_ASSIGN(ObMallocAllocator);
  ObTenantAllocator *get_tenant_allocator(uint64_t tenant_id) const;

private:
  obsys::CRWLock locks_[PRESERVED_TENANT_COUNT];
  ObTenantAllocator *allocators_[PRESERVED_TENANT_COUNT];
  int64_t reserved_;
  int64_t urgent_;

  static ObMallocAllocator *instance_;
}; // end of class ObMallocAllocator

} // end of namespace lib
} // end of namespace oceanbase

#endif /* _OB_MALLOC_ALLOCATOR_H_ */
package com.tairanchina.csp.avm.mapper; import com.baomidou.mybatisplus.mapper.BaseMapper; import com.tairanchina.csp.avm.entity.AndroidVersion; import org.apache.ibatis.annotations.Mapper; /** * Created by hzlizx on 2018/5/17 0017 */ @Mapper public interface AndroidVersionMapper extends BaseMapper<AndroidVersion> { }
import { CommonModule } from '@angular/common'; import { NgModule } from '@angular/core'; import { FormsModule } from '@angular/forms'; import { Http, HttpModule, RequestOptions, XHRBackend } from '@angular/http'; import { RouterModule } from '@angular/router'; import { BsDropdownConfig, BsDropdownModule, CollapseModule, TooltipConfig, TooltipModule } from 'ngx-bootstrap'; import { AuthenticationService } from 'ngx-login-client'; import { AlmEditableModule, AlmIconModule, MarkdownModule, WidgetsModule } from 'ngx-widgets'; import { SafePipeModule } from '../../pipes/safe.module'; import { factoryForHttpService, HttpService } from './../../services/http-service'; import { GlobalSettings } from './../../shared/globals'; import { CommentModule } from './../../widgets/comment-module/comment.module'; import { UserAvatarModule } from './../../widgets/user-avatar/user-avatar.module'; import { WorkItemCommentComponent } from './work-item-comment.component'; let providers = [ { provide: HttpService, useFactory: factoryForHttpService, deps: [XHRBackend, RequestOptions, AuthenticationService] }, GlobalSettings, TooltipConfig, BsDropdownConfig ]; @NgModule({ imports: [ AlmEditableModule, AlmIconModule, CollapseModule, CommonModule, CommentModule, BsDropdownModule, FormsModule, MarkdownModule, RouterModule, HttpModule, TooltipModule, WidgetsModule, SafePipeModule, UserAvatarModule ], declarations: [ WorkItemCommentComponent ], exports: [ WorkItemCommentComponent ], providers: providers }) export class WorkItemCommentModule { }
Here are the 1/4 scale figures from NECA at San Diego Comic-Con 2016. They were finally able to share the highly anticipated TMNT 1990 figures. They also had a couple of surprises, such as a second version of the Penguin. Included in the gallery are:

Teenage Mutant Ninja Turtles – 1990 Movie – Leonardo
Teenage Mutant Ninja Turtles – 1990 Movie – Donatello
Teenage Mutant Ninja Turtles – 1990 Movie – Michelangelo
Teenage Mutant Ninja Turtles – 1990 Movie – Raphael
Batman: Arkham City – Harley Quinn
Batman v Superman – Batman
Batman Begins – Batman
Iron Man 3 – Iron Man Mark 42
Avengers: Age of Ultron – Hulk
Avengers: Age of Ultron – Iron Man Mark 43
Marvel Comics – Deadpool
Batman Returns – “Mayor” Penguin
Batman Returns – Catwoman
// Bandwidth returns the upper and lower bandwidths of the matrix. func (t *TriBandDense) Bandwidth() (kl, ku int) { if t.isUpper() { return 0, t.mat.K } return t.mat.K, 0 }
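As a quick cross-check of what the method reports, the same bookkeeping can be done on a dense matrix. The short Python sketch below only illustrates the (kl, ku) convention (number of nonzero sub- and super-diagonals); it is an invented example, unrelated to the library's band-storage internals.

```python
import numpy as np

def bandwidths(A):
    """Return (kl, ku): the number of nonzero sub- and super-diagonals of A."""
    rows, cols = np.nonzero(A)
    kl = max(0, int(np.max(rows - cols))) if rows.size else 0
    ku = max(0, int(np.max(cols - rows))) if rows.size else 0
    return kl, ku

# An upper-triangular band matrix with K = 1 super-diagonal:
A = np.array([[1., 2., 0.],
              [0., 3., 4.],
              [0., 0., 5.]])
print(bandwidths(A))  # (0, 1), matching the (kl, ku) = (0, K) branch above
```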
Last week’s horrific terrorist attacks in Paris were a reminder that, unfortunately, the ongoing fight with global terrorism is still with us, and my thoughts are with those affected by the tragedy. In a new paper, “After the Paris Tragedy,” the BlackRock Investment Institute recently shared some insights on how this significant escalation in the intensity of the terrorist threat could impact the economy and markets going forward. Here are three of the key points.

Episodes of terror tend to accentuate existing economic and market trends, rather than constitute turning points. The market impact of past terror episodes, including the 2005 London bombings and the 2004 train bombings in Spain, was generally ephemeral, according to a BlackRock analysis of data accessible via Bloomberg. In other words, the attacks created headlines, became market talking points and then faded away. If anything, they reinforced the existing trajectory of the economy and markets. For instance, the September 11 attacks magnified the 2001 U.S. tech-bubble recession, but didn’t cause it.

Similarly, in the near term, the tragedy in Paris is likely to bolster economic trends already underway in Europe. The fallout from these attacks will probably put some pressure on the French economy as well as the euro. A likely (but reversible) dent in European consumer confidence should cement plans by the European Central Bank (ECB) for more easing measures in December. Meanwhile, France will likely further loosen the austerity reins to boost defense spending, with other European Union (E.U.) members following suit.

With the Islamic State (ISIS) and its affiliates set on internationalizing their reach, the terrorist threat against the West is likely the highest it has been since 2001. If the Paris tragedy marks an evolution toward an extended, global conflict, the market impact could be much greater than that of past terror attacks stemming from regional conflicts.
/** * Created by chenmuen on 2017/3/11. */ @Controller @RequestMapping("/stat") public class StatController extends BaseController { @Autowired StatService statService; @RequestMapping("/student") public String studentStat(HttpSession session, Model model) { String userId = getUserId(session); int courseCount = statService.getStudentCourseCount(userId); int scoreAvg = statService.getStudentAverageScore(userId); double consume = statService.getStudentConsume(userId); model.addAttribute("courseCount", courseCount); model.addAttribute("scoreAvg", scoreAvg); model.addAttribute("consume", consume); return "student-stat"; } @RequestMapping("/organ") public String organStat(HttpSession session, Model model) { String userId = getUserId(session); double income = statService.getOrganIncome(userId); int courseCount = statService.getOrganCourseCount(userId); int memberCount = statService.getOrganMemberCount(userId); model.addAttribute("income", income); model.addAttribute("courseCount", courseCount); model.addAttribute("memberCount", memberCount); return "organ-stat"; } @RequestMapping("/site") public String siteStat(HttpSession session, Model model) { double income = statService.getSiteIncome(); int courseCount = statService.getSiteCourseCount(); int memberCount = statService.getSiteMemberCount(); model.addAttribute("income", income); model.addAttribute("courseCount", courseCount); model.addAttribute("memberCount", memberCount); return "site-stat"; } /*------------- Action -------------*/ @RequestMapping("/student/score/bar") @ResponseBody public List studentScore(@RequestParam String studentId) { List<Integer> scoreList = statService.getStudentScoreDistribution(studentId); return scoreList; } @RequestMapping("/student/course/line") @ResponseBody public List studentCourse(@RequestParam String studentId) { return statService.getStudentCourseLine(studentId); } @RequestMapping("/organ/income/line") @ResponseBody public List organIncome(@RequestParam String organId) { List<Double> incomeList = statService.getOrganIncomeLine(organId); return incomeList; } @RequestMapping("/organ/member/line") @ResponseBody public List organMember(@RequestParam String organId) { List<Integer> incomeList = statService.getOrganMemberLine(organId); return incomeList; } @RequestMapping("/organ/member/bar") @ResponseBody public Map organRate(@RequestParam String organId) { Map<String, Object> map = statService.getOrganTopCourse(organId); return map; } @RequestMapping("/site/income/line") @ResponseBody public List siteIncome() { List<Double> incomeList = statService.getSiteIncomeLine(); return incomeList; } @RequestMapping("/site/member/line") @ResponseBody public List siteMember() { List<Integer> memberList = statService.getSiteMemberLine(); return memberList; } @RequestMapping("/site/organ/bar") @ResponseBody public Map siteCourse() { Map map = statService.getSiteTopOrgan(); return map; } }
The SESA network links duplication of the yeast centrosome with the protein translation machinery. The yeast spindle pole body (SPB), the functional equivalent of mammalian centrosome, duplicates in G1/S phase of the cell cycle and then becomes inserted into the nuclear envelope. Here we describe a link between SPB duplication and targeted translation control. When insertion of the newly formed SPB into the nuclear envelope fails, the SESA network comprising the GYF domain protein Smy2, the translation inhibitor Eap1, the mRNA-binding protein Scp160 and the Asc1 protein, specifically inhibits initiation of translation of POM34 mRNA that encodes an integral membrane protein of the nuclear pore complex, while having no impact on other mRNAs. In response to SESA, POM34 mRNA accumulates in the cytoplasm and is not targeted to the ER for cotranslational translocation of the protein. Reduced level of Pom34 is sufficient to restore viability of mutants with defects in SPB duplication. We suggest that the SESA network provides a mechanism by which cells can regulate the translation of specific mRNAs. This regulation is used to coordinate competing events in the nuclear envelope.
Salvage Chemotherapy Following Osimertinib in Non-small Cell Lung Cancer Harboring Epidermal Growth Factor Receptor Mutation Aim: The current study reports the types of salvage chemotherapy given after osimertinib, and their efficacy, in patients with non-small cell lung carcinoma (NSCLC) who acquired resistance to osimertinib. Patients and Methods: In this retrospective cohort study, data were obtained from the medical charts of 40 patients with NSCLC treated with osimertinib, focusing primarily on the 14 who underwent salvage chemotherapy, with either an epidermal growth factor receptor tyrosine kinase inhibitor (EGFR-TKI) or cytotoxic agents, immediately after osimertinib. The treatment efficacy of salvage chemotherapy was evaluated. Results: Five and nine patients received EGFR-TKIs and cytotoxic agents following osimertinib, respectively. The overall response rate to EGFR-TKI treatment following osimertinib tended to be lower than that to cytotoxic agents (0% vs. 44.4%). Median progression-free survival was significantly poorer in patients receiving EGFR-TKI treatment than in those receiving cytotoxic agents. Conclusion: Administration of cytotoxic agents should be considered more frequently than EGFR-TKIs for patients with NSCLC resistant to osimertinib.
fetch('https://jsonplaceholder.typicode.com/posts/1')
  .then((response) => {
    return response.json();
  })
  .then((json) => {
    console.log(json)
  });

const fetchPost1 = fetch('https://jsonplaceholder.typicode.com/posts/1');
const fetchPost2 = fetch('https://jsonplaceholder.typicode.com/posts/2');
const fetchPost3 = fetch('https://jsonplaceholder.typicode.com/posts/3');

Promise.all([fetchPost1, fetchPost2, fetchPost3])
  .then(() => {
    console.log('done');
  });

function* infiniteLoop() {
  let i: number = 0;
  while (true) {
    yield i++;
  }
}

const iLoop = infiniteLoop();
console.log(iLoop.next());
console.log(iLoop.next());
console.log(iLoop.next());
import React from 'react'; interface TagsTextFieldProps { onChange: (value: string[]) => void; onBlur?: (event: React.FocusEvent<HTMLInputElement>) => void; readOnly?: boolean; markers?: unknown[]; value?: string[]; inputId?: string; } interface State { inputValue: string; } export default class TagsTextField extends React.Component<TagsTextFieldProps & Omit<React.HTMLProps<HTMLInputElement>, 'onBlur' | 'onChange' | 'value'>, State> { _input: HTMLInputElement | null; state: State; addTag(tagValue: string): void; removeTag(index: number): void; addAndClearInput(tagValue: string): void; handleRemoveTagClick: (event: React.MouseEvent<HTMLAnchorElement>) => void; handleKeyDown: (event: React.KeyboardEvent<HTMLInputElement>) => void; handleKeyPress: (event: React.KeyboardEvent<HTMLInputElement>) => void; handleBlur: (event: React.FocusEvent<HTMLInputElement>) => void; handleInputChange: (event: React.ChangeEvent<HTMLInputElement>) => void; focus(): void; setInput: (el: HTMLInputElement | null) => void; render(): JSX.Element; } export {}; //# sourceMappingURL=TextField.d.ts.map
Latest News

Want to understand how bee-killing neonicotinoids (a class of insecticide) work in less than two minutes, and why you should care that the EPA does nothing to reverse the damage these pesticides have done to our pollinating insects? Watch this brief video that explains it all.

Dr. Keith Tyrell explains how this new class of pesticides, neonicotinoids, which are considered “new” in that they have only been on the market for about 20 years, are taken up by plants as they grow. These ‘neonics’ are unlike older pesticides because they become part of the plant itself, making it toxic. (Neonics are taken up by the roots or leaves and carried to all other parts of the plant.) Tyrell’s summary is a brief insight into why neonics are ‘bad for everything.’

The European Union has imposed a two-year moratorium on all neonics, but the US still allows them to be sprayed everywhere. In fact, the EPA has decided to allow more of these bee- and butterfly-killing chemicals to enter the environment despite clear dangers, and they are the fastest-growing class of pesticides in the United States. This, even though imidacloprid and acetamiprid, two types of neonics, could possibly be impairing the developing human nervous system.

What’s worse, one study conducted by the U.S. Geological Survey found that neonics are widespread contaminants of the groundwater that many people drink or bathe in. In nine rivers monitored in the Midwest, where neonics are most heavily used, the study found clothianidin in about three-quarters of monitored sites, thiamethoxam in about one-half, and imidacloprid in about one-quarter.

If numerous communities banning neonics over pollinator deaths, articles reporting on how the chemicals are killing millions of bees, and 100+ organizations urging Obama to take action against the chemicals aren’t enough for the EPA to act, then I’m not sure what it will take. If you haven’t yet taken the time to understand neonics, I urge you to take two minutes and do it now.

Featured image credit: PakalertPress
Estimating R-Process Yields from Abundances of the Metal-Poor Stars The chemical abundances of metal-poor stars provide important clues for exploring stellar formation history and set significant constraints on models of the r-process. In this work, we find that the abundance patterns of the light and iron-group elements of the main r-process stars are very close to those of the weak r-process stars. Based on a detailed abundance comparison, we find that the weak r-process occurs in supernovae with a progenitor mass range of ∼11–26 M⊙. Using the SN yields given by Heger & Woosley and the abundances of the weak r-process stars, the weak r-process yields are derived. The SNe with a progenitor mass range of 15 M⊙ < M < 26 M⊙ are the main sites of the weak r-process, and their contributions are larger than 80%. Using the abundance ratios of the weak r-process and the main r-process in the solar system, the average yields of the main r-process are estimated. The observed correlations of the abundance ratios can be explained by mixing of the two r-process abundances in various fractions.
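The two-component picture described here lends itself to a simple arithmetic check. The sketch below, with invented numbers (the component patterns are placeholders, not the paper's yields), shows how an observed abundance pattern can be decomposed into main and weak r-process fractions of the form N_obs = f·N_main + (1 − f)·N_weak:

```python
import numpy as np

# Placeholder "component" abundance patterns for three elements; these are
# invented for the demo, not the yields derived in the paper.
N_main = np.array([1.00, 0.40, 0.08])
N_weak = np.array([0.30, 0.90, 0.50])
N_obs = 0.7 * N_main + 0.3 * N_weak   # a fake "observed" star

# Least-squares estimate of the mixing fraction f from the observed pattern.
d = N_main - N_weak
f = np.dot(N_obs - N_weak, d) / np.dot(d, d)
print(f"estimated main r-process fraction: {f:.2f}")  # recovers 0.70
```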
Prior art methods for producing a black surface coating involve such procedures as depositing on the substrate a coating of black paint, a coating of black surface oxides or metallic compounds, a black coating of metal alloys, or a black coating of mixed metals. A number of prior methods have been used to blacken areas to produce what are referred to as "optical black" surfaces. In early work described in U.S. Pat. No. 3,867,207, Decker describes a process of blackening a component by using an electroless plating bath selected from the group consisting of nickel and cobalt and, after subsequent rinsing, immersing the component in an oxidizing acid bath of phosphoric, sulfuric and nitric acids and thereafter firing the component to form the blackened surface. More recently, a process for producing black surface coatings has been described in U.S. Pat. No. 4,233,107, as well as in U.S. Pat. No. 4,361,630 to Johnson, Sr., which involves preparing a substrate by cleaning and/or activating it and immersing the thus-prepared substrate in an electroless plating bath containing nickel and hypophosphite ions in solution until an electroless nickel-phosphorus alloy coating has been deposited on the substrate. Thereafter, the substrate, coated with the electroless nickel-phosphorus alloy and then washed and dried, is immersed in an etchant bath consisting of an aqueous solution of nitric acid, wherein the nitric acid concentration ranges from a 1:5 ratio with distilled or deionized water up to concentrated acid, until the coated surface of the substrate develops ultra-blackness. While the process described in the aforementioned Johnson, Sr., patent provides a highly blackened surface on a substrate, it has been found that its emissivity capabilities are limited, such that its primary use is as a solar collector in the field of solar energy. In contradistinction, the present invention is not so limited, in that substrates produced in accordance with the present invention have a highly blackened surface characterized by high infrared emissivities as well as high absorptive capabilities, thus making them extremely suitable for use in such devices as infrared telescopes and sensors.
Extinctions in the random replicator model The statistical properties of an ecosystem composed of species interacting via pairwise, random interactions and deterministic, concentration-limiting self-interactions are studied analytically with the tools of the equilibrium statistical mechanics of disordered systems. Emphasis is given to the effects of externally induced extinction of a fixed fraction of species at the outset of the evolutionary process. The manner in which the ecosystem copes with the initial extinction event depends on the degree of competition among the species as well as on the strength of that event. For instance, in the regime of high competition the ecosystem diversity, given by the fraction of surviving species, is practically insensitive to the strength of the initial extinction provided it is not too large, while in the less competitive regime the diversity decreases linearly with the size of the event. In the case of large extinction events we find that no further biotic extinctions take place and, furthermore, that rare species become very unlikely to be found in the ecosystem at equilibrium. In addition, we show that the reciprocal of the Edwards-Anderson order parameter yields a good measure of the diversity of the model ecosystem.

Introduction
Extinction seems to be the final outcome of the evolution of species. In fact, as species survive about 10 million years on average, nearly all species that have ever existed are extinct, and only a very small fraction of them have left their impressions in the fossil record. The causes of the mass extinction events are currently a matter of dispute, with two main types of explanation. The more traditional one asserts that extinction is caused by external stresses such as major climate changes and asteroid impacts. This point of view is supported by evidence such as the unusual quantity of iridium and other noble metals in the rocks that mark the boundary between the Cretaceous and Tertiary periods, when the era of the dinosaurs gave way to the era of the mammals. Since iridium is more common in asteroids than in the Earth's crust, this finding can be viewed as evidence for an asteroid impact. The alternative explanation asserts that extinctions are caused by the interactions between the species in the ecosystems. In particular, Paine has shown that species richness can sometimes be increased by predator-mediated coexistence, and that the removal of predators can lead to additional species extinctions. Some recent studies indicate that food webs with many species or high connectivity are more likely to lose species as a consequence of the extinction of a single species than simpler food webs. Although this kind of argument seems well suited to explain the so-called background extinctions, it certainly needs some new ingredients to explain mass extinctions as well. In fact, the missing ingredient seems to be the concept of self-organized criticality, which in this context is best illustrated by the popular Bak-Sneppen model. According to this model, the fitness of each species is affected by the other species to which it is coupled in the ecosystem, so that large events in the evolutionary history may be thought of as large coevolutionary avalanches caused by the intrinsic dynamics of the model. In this model, the distribution of extinction sizes follows a power law, which is a valid candidate for fitting the experimental data.
Although we recognize that evolution, and hence extinction, are, as pictured by the models mentioned above, essentially dynamical phenomena, in this work we study these phenomena within the equilibrium statistical mechanics framework of the random replicator model for species coevolution. Deterministic replicator models are commonly used to describe the evolution of self-reproducing entities in a variety of fields such as game theory, prebiotic evolution and sociobiology. The random replicator model introduced by Diederich and Opper attempts to model the uncertainties and the overwhelming complexity of the interspecies interactions in biological ecosystems by assuming that those interactions are random. However, it is also assumed that the dynamics is such that a fitness functional (a Lyapunov function) is maximized, so that the only stationary states are fixed points. In fact, the existence of such a functional leads to a replicator equation with symmetric interspecies interactions, which is a severe assumption from the biological standpoint. It allows, however, full use of the tools of equilibrium statistical mechanics to study analytically the average properties of the equilibrium states of this kind of disordered ecosystem. An interesting result of the random replicator model is that in the equilibrium state a fraction of the species is extinct. The mechanism of extinction is clearly outcompetition and, in the absence of any cooperation pressure, only the pair of species with the largest reinforcing interactions will thrive. In this contribution we study the effects of the random elimination of a fixed fraction of the species at the outset of the evolutionary process, giving emphasis to the distribution of the remaining species concentrations in the equilibrium state. There are several interesting issues that can be addressed in this framework. For instance, what is the equilibrium situation when the fraction of species eliminated at the beginning is already larger than the fraction that would go extinct naturally due to outcompetition? Furthermore, how does the ecosystem cope with large initial extinction events? In this paper we give clear-cut analytic answers to these questions, which are partly corroborated by numerical simulations of the model ecosystem. The remainder of the paper is organized as follows. In Sec. 2 we introduce the model and discuss the ecological interpretation of the control parameters. The equilibrium properties of the model are derived within the replica-symmetric framework in Sec. 3, and the use of the reciprocal of the Edwards-Anderson order parameter as a measure for the diversity of the ecosystem is suggested. In Sec. 4 we calculate the probability distribution of the concentration of a given species, thus allowing the explicit calculation of the ecosystem diversity as the fraction of surviving species at equilibrium. Finally, in Sec. 5 we present some concluding remarks.

Model
We consider an infinite population (ecosystem) composed of individuals belonging to N different species, whose fitness F_i (i = 1, ..., N) are the derivatives F_i = ∂F/∂x_i of a fitness functional F built from the pairwise couplings introduced below. Here x_i is the fraction of species i and b_i is a quenched random variable that takes the values 0 and 1 with probabilities a and 1 − a, respectively. Hence, on average, aN randomly chosen species are eliminated at the outset, and so henceforth we will refer to a ∈ [0, 1] as the dilution parameter.
An effective competition among the species is enforced by requiring that the concentrations of the surviving species satisfy a constraint that fixes their overall scale, where Q_0 is an arbitrary positive constant which gives the scale of the concentrations x_i. The coupling strengths J_ij between species i and j are statistically independent quenched random variables with a Gaussian distribution, so that J_ij < 0 corresponds to pairs of cooperating species while J_ij > 0 corresponds to pairs of competing species. The self-interaction parameter u ≥ 0 acts as a global cooperation pressure limiting the growth of any single species, and it is crucial to guarantee the existence of a nontrivial thermodynamic limit, N → ∞. In fact, for large u the minimum of H corresponds to a homogeneous ecosystem in which the surviving species all have essentially the same concentration. The positive self-interaction means that individuals of a same species compete against themselves, which is quite reasonable as they certainly share the same resources (ecological niche). The time evolution of the species concentrations is given by the replicator equation, whose fixed points are the minima of H(x), and in the following we use the replica formalism to study analytically the statistical properties of these minima.

The replica approach
Following the standard prescription of performing quenched averages on extensive quantities only, we define the average free-energy density f in terms of the partition function Z, where β = 1/T is the inverse temperature. Taking the limit T → 0 ensures that only the states that minimize H(x) will contribute to Z. We impose an additional constraint to avoid divergences when carrying out the integrals over x_i. Here ⟨···⟩ stands for the average over the coupling strengths J_ij as well as over the auxiliary variables b_i. As usual, the evaluation of the quenched average can be carried out through the replica method: using the identity ⟨ln Z⟩ = lim_{n→0} (⟨Z^n⟩ − 1)/n, we first calculate ⟨Z^n⟩ for integer n, i.e., for Z^n written as a product of n replicas of the original system, and then analytically continue to n = 0. The final result is an effective replica free energy in which P_0 = a and P_1 = 1 − a. We note that while we have calculated the average over the couplings J_ij explicitly, we have used the self-averaging property (1/N) Σ_i ln G_0(b_i) = Σ_b P_b ln G_0(b) to eliminate the site dependence of the b_i variables. The relevant physical order parameters are the overlaps which measure, respectively, the overlap between a pair of different equilibrium states x and x′, and the overlap of an equilibrium state x with itself. Here ⟨···⟩_T stands for a thermal average taken with the Gibbs probability distribution W(x).

To proceed further we assume that the saddle-point parameters are symmetric under permutations of the replica indices, i.e., p_α = p, p̂_α = p̂, q_{αβ} = q, q̂_{αβ} = q̂, R_α = R and Q̂_α = Q̂. With this prescription the evaluation of the replicated free energy is straightforward, yielding the replica-symmetric free-energy density f, in which y = β(p − q) and Dz = dz exp(−z²/2)/√(2π) is the Gaussian measure. Already at this stage we can see that the concentration of species eliminated at the outset, given by the parameter Q, decouples from the other physical parameters and hence has no effect upon them. In the zero-temperature limit the saddle-point conditions ∂f/∂q = 0, ∂f/∂y = 0 and ∂f/∂R = 0 yield a closed set of equations for q, y and R. We note that the parameter associated with the concentration of surviving species, Q_0, appears only as a scale of q, and so henceforth we will set Q_0 = 1 without loss of generality.
In the replica-symmetric framework the Edwards-Anderson order parameter q is defined as the self-overlap of an equilibrium state. If the concentrations x_i were normalized to 1 rather than to N, then q would give the probability that two randomly selected individuals are of the same species, a quantity known as Simpson's index. Nevertheless, we can still give a simple physical interpretation to q. For instance, values of q of order 1 indicate the coexistence of a macroscopic number of species (i.e., x_i ≈ 1 for an extensive number of species), while large values of q signal the dominance of a few species only (i.e., x_i ≈ N for a finite number of species). Of course, this interpretation is equivalent to that given above for Simpson's index, and so we can view 1/q as a measure of the diversity of the ecosystem.

In Fig. 1 we present 1/q as a function of the dilution parameter a for several values of the cooperation pressure u. The results of the numerical solution of the replicator equation for N = 500 are also presented. Each data point is the average over 100 realizations of the matrix of coupling strengths, starting with a uniform distribution of concentrations. Since the labeling of the species is arbitrary, we can set b_i = 0 for i ≤ aN and b_i = 1 otherwise, without loss of generality. In addition, we choose a such that aN is an integer, for simplicity. In agreement with our interpretation, for a close to 1 we observe the vanishing of 1/q, which characterizes an ecosystem composed of a few species only. For small values of u the analytical results show the existence of a maximum of diversity at a nonzero value of the dilution parameter (see the inset in Fig. 1); the numerical results, however, do not corroborate this finding. In fact, carrying out the standard local stability analysis, we find that the replica-symmetric solution is locally stable only where a suitable stability condition is satisfied. Figure 2 shows the regions in the plane (a, u) where the replica-symmetric solution is stable. In particular, we find that for a = 0 this solution is stable for u > 1/√2, while for a = 1 it is stable for u > 1/2. Hence, the maxima observed in Fig. 1 are indeed artifacts of the replica-symmetry framework. Nevertheless, the agreement between the analytical and numerical results is already excellent for u > 0.6. The rather puzzling insensitivity of the diversity to the dilution parameter for small u has a simple explanation, as will be seen in the next section.

Discussion
Although the interpretation of the reciprocal of the Edwards-Anderson order parameter as the ecosystem diversity yields some information on the distribution of species at equilibrium, a better understanding is achieved by calculating explicitly the probability distribution P_k(x) that the concentration of one of the (1 − a)N remaining species, say x_k, assumes the value x; the thermal average involved is taken with the Gibbs distribution W(x). As all non-vanishing species concentrations are equivalent, we can write P_k(x) = P(x) ∀k. To evaluate this distribution we introduce an auxiliary energy H_aux, in terms of which P(x) can be written using Z_aux, the partition function with H replaced by H_aux; the calculations needed to evaluate P(x) then become analogous to those used in the evaluation of the free-energy density. In addition, to handle a possible singularity in the zero-temperature limit it is more convenient to deal with the cumulative distribution function C(x) = ∫_0^x dx′ P(x′). Carrying out the calculations within the replica-symmetric framework we obtain an explicit expression for C(x) in which q, y and Δ are given by the saddle-point equations.
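The numerical points referred to above (Fig. 1, and the simulation checks of C(x) discussed next) come from integrating the replicator dynamics directly. The following is a minimal sketch of such an experiment in Python. Since the paper's explicit equations are not reproduced in this text, everything here is an assumption made for illustration, not the authors' exact setup: a quadratic H with Gaussian couplings of variance 1/N, the flow dx_i/dt = x_i(F_i − Φ) with F_i = −∂H/∂x_i, and the linear normalization Σ_i x_i = N re-imposed after each Euler step.

```python
import numpy as np

def simulate(N=500, a=0.2, u=1.0, steps=5000, dt=0.01, seed=0):
    """Euler-integrate one realization of a diluted random replicator system.

    Assumed (illustrative) dynamics: H = sum_{i<j} J_ij x_i x_j + u sum_i x_i^2,
    F_i = -dH/dx_i, dx_i/dt = x_i (F_i - Phi) with Phi = (1/N) sum_j x_j F_j,
    and the normalization sum_i x_i = N re-imposed after each step.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
    J = (J + J.T) / 2.0                    # symmetric couplings
    np.fill_diagonal(J, 0.0)
    alive = rng.random(N) >= a             # dilution: remove a fraction a at the outset
    x = np.where(alive, 1.0, 0.0)
    x *= N / x.sum()                       # impose the normalization
    for _ in range(steps):
        F = -(J @ x + 2.0 * u * x)         # F_i = -dH/dx_i
        phi = np.dot(x, F) / N             # mean fitness
        x = np.clip(x + dt * x * (F - phi), 0.0, None)
        x *= N / x.sum()                   # keep sum_i x_i = N
    q = np.dot(x, x) / N                   # self-overlap; 1/q estimates diversity
    survivors = np.mean(x[alive] > 1e-6)   # fraction of non-diluted species alive
    return 1.0 / q, survivors

div, surv = simulate()
print(f"diversity 1/q ~ {div:.3f}, surviving fraction ~ {surv:.3f}")
```

With these (assumed) conventions, q = (1/N)Σ_i x_i² equals 1 for a perfectly homogeneous ecosystem and grows when a few species dominate, matching the interpretation of 1/q given above.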
In Fig. 3 we show C(x) for u = 0.8 and several values of a. The first point to note is that lim_{x→∞} C(x) = 1 − a, which yields the fraction of species surviving the initial extinction event, as expected. In addition, a nonzero value of C(0) indicates that the probability distribution P(x) has a delta peak at x = 0, and so C(0) actually yields the fraction of the species that survived the initial, externally induced extinction event but became extinct later on due to outcompetition. In the regime of large dilution, say a > 0.8 in Fig. 3, the cumulative distribution is very small and practically constant for small concentrations, indicating that no further extinctions have taken place and, furthermore, that rare species are very unlikely to be found in the ecosystem at equilibrium. We note that the numerical simulations yield results practically indistinguishable from the analytical ones.

The rough independence of the diversity 1/q from the dilution parameter a observed in Fig. 1 for small u is easily understood with the aid of the cumulative distributions. In fact, a direct measure of the ecosystem diversity is given by the fraction of surviving species 1 − a − C(0), which is shown in Fig. 4 as a function of a. (We recall that a is the fraction of species that went extinct at the outset due to some external stress and C(0) is the fraction that died out later due to outcompetition.) The remarkable similarity between these figures corroborates our interpretation of 1/q as a measure of the diversity. Clearly, the diversity is insensitive to variations of a whenever the fraction of extinct species in the undisturbed ecosystem (i.e., C(0) calculated at a = 0) is already considerably larger than a, so that the species eliminated at the outset would probably have become extinct later on anyway.

Conclusion
Although the dynamics of the random replicator model may not look very appealing, in the sense that it always leads to fixed points, the frustration caused by the competition between the concentration-limiting self-interactions (u > 0) and the tendency to unlimited growth of pairs of strongly cooperative (J_ij < 0) species results in a highly nontrivial equilibrium, characterized by many metastable states and a phase of replica symmetry breaking. Of course, these very features make some aspects of the dynamics (e.g., slow relaxation and hysteresis effects) nontrivial as well. The wealth of ecologically relevant issues that can be addressed within this equilibrium framework can be appreciated, for instance, in the case of high-order interactions among the species, where the emergence of a threshold value has been reported which gives a lower bound to the concentration of the surviving species, preventing the existence of rare (low-concentration) species in the ecosystem. An important outcome of the equilibrium analysis of the random replicator model is the finding that, in order to reduce the degree of frustration, a fraction of the species dies out. This type of extinction clearly has a biotic cause, namely outcompetition. In this paper we study how the model ecosystem copes with abiotic, or externally induced, extinction, in which a fraction of randomly chosen species is eliminated at the beginning of the coevolutionary process. We find that in the regime of high competition (small u) the ecosystem diversity, i.e., the fraction of surviving species, is practically insensitive to the strength a of the initial extinction provided it is not too large, while in the less competitive regime (large u) the diversity decreases linearly with increasing a.
In the case of a large extinction event we find that no further (biotic) extinctions take place and, furthermore, that rare species become very unlikely to be found in the ecosystem at equilibrium. This is distinct from the result mentioned above for the case of high-order interactions, where the probability of finding rare species in the ecosystem is strictly null. An interesting by-product of our investigation is the finding that the reciprocal of the Edwards-Anderson order parameter (i.e., the replica-symmetric overlap between two equilibrium states) serves as an easy-to-calculate measure of the diversity of the model ecosystem. This opens the exciting possibility of interpreting the different hierarchical levels of the overlap order parameter in the full replica-symmetry-breaking scheme as different levels of a phylogenetic tree that gives the relations of dependence (viewed as ancestry) among the species.
Cardiopathy of aging: are the changes related to congestive heart failure? Over the years, hemodynamic stresses and biologic changes bring about reduced cardiac function. The addition of one or more types of organic heart disease leads to further deterioration of function. This is why elderly patients require special consideration and management, and why their clinical manifestations and therapeutic responses differ from those in young patients. Although no single abnormality characterizes the aging process, cellular, functional, and structural changes support the existence of a cardiopathy. However, there are insufficient data to link so-called senile cardiopathy directly to otherwise unexplained heart failure. Failure is usually due to the typical causes, i.e., coronary artery or valvular disease, hypertension, amyloidosis, and chronic pulmonary lesions. Nevertheless, the possibility of senile heart failure should not be overlooked in cases of impending or actual myocardial failure. In patients over 60, edema, dyspnea, or tachycardia cannot always be attributed to heart disease. It is hazardous to diagnose and prescribe treatment for cardiac failure if the heart shadow is not enlarged on the x-ray, the circulation time is not prolonged, and the heart sounds and rhythm are normal. Other reasons for the complaints should be looked for, even when the heart rate is fast.
Infection with Conditionally Virulent Streptococcus pneumoniae pab Strains Induces Antibody to Conserved Protein Antigens but Does Not Protect against Systemic Infection with Heterologous Strains ABSTRACT Avirulent strains of a bacterial pathogen could be useful tools for investigating immunological responses to infection and potentially effective vaccines. We have therefore constructed an auxotrophic TIGR4 pab strain of Streptococcus pneumoniae by deleting the pabB gene Sp_0665. The TIGR4 pab strain grew well in complete medium but was unable to grow in serum unless it was supplemented with para-aminobenzoic acid (PABA). The TIGR4 pab strain was markedly attenuated in virulence in mouse models of S. pneumoniae nasopharyngeal colonization, pneumonia, and sepsis. Supplementing mouse drinking water with PABA largely restored the virulence of TIGR4 pab. An additional pab strain constructed in the D39 capsular serotype 2 background was also avirulent in a sepsis model. Systemic inoculation of mice with TIGR4 pab induced antibody responses to S. pneumoniae protein antigens, including PpmA, PsaA, pneumolysin, and CbpD, but not capsular polysaccharide. Flow cytometry demonstrated that IgG in sera from TIGR4 pab-vaccinated mice bound to the surface of TIGR4 and D39 bacteria but not to a capsular serotype 3 strain, strain 0100993. Mice vaccinated with the TIGR4 pab or D39 pab strain by intraperitoneal inoculation were protected from developing septicemia when challenged with the homologous S. pneumoniae strain. Vaccination with the TIGR4 pab strain provided only weak or no protection against heterologous challenge with the D39 or 0100993 strain but did strongly protect against a TIGR4 capsular-switch strain expressing a serotype 2 capsule. The failure of cross-protection after systemic vaccination with pab bacteria suggests that parenteral administration of a live attenuated vaccine is not an attractive approach for preventing S. pneumoniae infection.
Wound healing in tissues is a complex reparative process. Under normal circumstances, the process of acute wound healing can be broken down into three phases. An initial inflammatory phase is followed by robust tissue remodeling and proliferation (the proliferative phase), which is in turn succeeded by a maturational phase wherein re-epithelialization, dermal angiogenesis and wound closure ensue. Re-epithelialization involves the migration and proliferation of epithelial tissue, primarily keratinocytes. Angiogenesis is the growth of new blood vessels from pre-existing conduits, and is regulated by a panoply of soluble cytokines including growth factor polypeptides, as well as by cell-cell and cell-matrix interactions. Chronic wounds exhibit a different healing profile from normal acute wounds in that they generally remain in an inflamed state for protracted periods of time. Non-healing wounds are most commonly observed among people with diabetes, venous stasis disease, and patients who are immobilized.

In view of the foregoing, it would be desirable to provide new biomolecules that safely and efficiently potentiate epithelial and vascular wound healing mechanisms in both acute and chronic wound healing situations. Drugs for promoting wound healing have recently been developed, such as becaplermin, a genetically engineered recombinant PDGF from Johnson & Johnson, or a pharmaceutical composition for regeneration and repair of mammalian tissues comprising PDGF and dexamethasone (EP0575484). U.S. Pat. No. 5,981,606 discloses a wound healing agent comprising TGF-beta, and U.S. Pat. No. 6,800,286 and U.S. Pat. No. 5,155,214 disclose wound healing agents comprising FGF. All of the healing agents described so far are growth factors, cytokines or chemokines, collagen, or hyaluronic acid. These agents present the drawback of inducing adverse events, as they are not specific to one cell type. There is still a need for an alternative wound healing agent that produces effective and rapid wound healing without causing adverse events.

The present invention aims to provide new peptides as an alternative wound healing agent, said peptides being specific to the angiogenesis mediated by endothelial cells. Peptides, which are less than 50 amino acids long, are an attractive tool for therapeutic use owing to their small size: they offer high affinity and specificity for their target, low toxicity profiles, room-temperature storage, and better tissue penetration. Moreover, they are easier to synthesize than a full-length protein, so their industrial production is more standardized and controlled. They raise no viral safety issues, as they can be chemically synthesized, and they do not suffer from refolding issues, glycosylation issues, or activity variability.
/* eslint-disable @typescript-eslint/camelcase */ import * as Contentful from 'contentful'; import { TypeTestimonial } from 'lib/types'; import { Background } from 'components/Section/background'; import { Cta } from 'components/cta'; import { isRichText, renderRichText } from 'lib/rich-text'; export const Testimonial = ({ fields }: TypeTestimonial) => { const { title, backgroundColor, showLogo, quotes, logosNew, ctaType, ctaLabel, ctaTarget } = fields; return ( <Background {...{ background: backgroundColor, image: false as Contentful.EntryFields.Boolean, imageOverlay: false as Contentful.EntryFields.Boolean, }}> <div className="w-full flex flex-col grid justify-items-center"> <div className="w-full grid justify-items-center"> <h2 className="h0 pt-4 text-3xl font-medium leading-tight text-gray-900">{title}</h2> </div> <div className="pt-4 flex flex-row flex-nowrap justify-start overflow-x-scroll"> {quotes && quotes.map(function (quote, idx) { const textComp = isRichText(quote.fields.quote) ? renderRichText(quote.fields.quote) : quote.fields.quote; if (showLogo) { return ( <div key={idx} className="w-full flex-shrink-0 md:flex md:flex-row md:p-8 items-center"> <div className="md:w-2/5"> {showLogo && <img src={quote.fields.illustrationNew.fields.file.url} />} </div> <div className="md:w-3/5 md:pl-8 "> <div>{textComp}</div> <div className="font-bold pt-4">{quote.fields.author}</div> </div> </div> ); } else { return ( <div key={idx} className="w-full flex-shrink-0 md:flex md:flex-row md:p-8 items-center"> <div className="md:full md:pl-8 "> <div>{textComp}</div> <div className="font-bold pt-4">{quote.fields.author}</div> </div> </div> ); } })} </div> <div className="pt-4 flex flex-wrap items-center"> {logosNew && logosNew.map(function (logo, idx) { return ( <div key={idx} className="flex-shrink-0 p-4"> <img src={logo.fields.file.url} className="w-28" /> </div> ); })} </div> <div className="flex w-full justify-center pt-8 pb-8"> <Cta {...{ ctaType, ctaLabel, ctaTarget }} /> </div> </div> </Background> ); };
package ru.mk.pump.web.elements;

import lombok.extern.slf4j.Slf4j;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.openqa.selenium.By;
import ru.mk.pump.commons.helpers.Parameter;
import ru.mk.pump.commons.helpers.Parameters;
import ru.mk.pump.commons.utils.Str;
import ru.mk.pump.web.browsers.api.Browser;
import ru.mk.pump.web.elements.api.concrete.Button;
import ru.mk.pump.web.elements.api.concrete.DropDown;
import ru.mk.pump.web.elements.api.concrete.TextArea;
import ru.mk.pump.web.elements.internal.BaseElement;

import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.mock;

@Slf4j
class ElementFactoryTest {

    private ElementFactory elementFactory;

    private Browser browser = mock(Browser.class);

    @BeforeEach
    void setUp() {
        elementFactory = new ElementFactory(new ElementImplDispatcher(), browser);
    }

    @Test
    void withMainRequirements() {
        // TODO: not yet implemented
    }

    @Test
    void addActionListener() {
        // TODO: not yet implemented
    }

    @Test
    void withActionListener() {
        // TODO: not yet implemented
    }

    @Test
    void newElement() {
        final Parameters param = Parameters.of(Parameter.of("additional xpath", By.class, "xpath"));
        final ElementConfig config = ElementConfig.of("Test element", "For unit test")
                .withParameters(param);
        final By by = By.tagName("div");
        log.info(Str.toPrettyString(elementFactory.getInfo()));

        final Button button = elementFactory.newElement(Button.class, by, config);
        assertThat(button).isNotNull();
        assertThat(((BaseElement) button).getParams()).isEqualTo(param);
        assertThat(button.info().getName()).isEqualTo("Test element");
        assertThat(button.info().getDescription()).isEqualTo("For unit test");

        final DropDown dropDown = elementFactory.newElement(DropDown.class, by, config);
        assertThat(dropDown).isNotNull();
        assertThat(((BaseElement) dropDown).getParams()).isEqualTo(param);
        assertThat(dropDown.info().getName()).isEqualTo("Test element");
        assertThat(dropDown.info().getDescription()).isEqualTo("For unit test");
    }

    @Test
    void newElementList() {
        final Parameters param = Parameters.of(Parameter.of("additional xpath", By.class, "xpath"),
                Parameter.of("another parameter", "string"));
        final ElementConfig configParent = ElementConfig.of("Test element", "For unit test")
                .withParameters(param);
        final ElementConfig configChild = ElementConfig.of("Test element", "For unit test")
                .withParameters(param);
        final By byParent = By.tagName("section");
        final By byChild = By.xpath(".//button[@data-aid='startRegistration']");

        final TextArea parent = elementFactory.newElement(TextArea.class, byParent, configParent.withIndex(1));
        assertThat(parent).isInstanceOf(TextArea.class);
        assertThat(((BaseElement) parent).getParams()).isEqualTo(param);
        assertThat(parent.info().getName()).isEqualTo("Test element");
        assertThat(parent.info().getDescription()).isEqualTo("For unit test");
        assertThat(parent.advanced().isList()).isTrue();
        assertThat(parent.advanced().getIndex()).isEqualTo(1);

        final Button button = elementFactory.newElement(Button.class, byChild, parent, configChild);
        assertThat(button).isInstanceOf(Button.class);
        assertThat(((BaseElement) button).getParams()).isEqualTo(param);
        assertThat(button.info().getName()).isEqualTo("Test element");
        assertThat(button.info().getDescription()).isEqualTo("For unit test");
        assertThat(button.advanced().isList()).isFalse();
        assertThat(button.advanced().getIndex()).isEqualTo(-1);
    }
}
A primary cutaneous adenoid-cystic carcinoma in a young woman. Differential diagnosis and clinical implications

Primary cutaneous adenoid-cystic carcinoma (PCACC) is a rare, slow-growing neoplasm of disputed histogenesis, characterized by a cribriform pattern at histology and locally aggressive behaviour. To date, about 60 cases of PCACC have been reported in the literature. This tumour is most common in the scalp, affects middle-aged and older individuals (mean age 59 years) and has a predilection for women. We describe an unusual case of PCACC in a 32-year-old woman referred to our clinic for a subcutaneous nodule of the scalp showing slow growth and an indolent course. The differential diagnosis and the clinical management of this PCACC patient, successfully treated with a wide local excision, are presented and discussed.

Introduction
Primary cutaneous adenoid cystic carcinoma (PCACC) is a rare tumour that affects middle-aged and older individuals. It can occur in the scalp (32-41%), chest and abdomen and is characterized by an indolent course and locally aggressive behaviour. The average duration of the tumour prior to diagnosis is about 10 years. Patients typically present with slowly expanding, skin-coloured nodules from 0.5 to 9 cm in size. If the scalp is involved, alopecia is generally an associated finding. 3 In the past PCACC was regarded as an eccrine lesion, but a possible origin from apocrine glands has also been proposed. 2

Case Report
A 32-year-old woman presented with a subcutaneous nodule of the scalp with associated alopecia (Figure 1). Three years prior to presentation she had noted an indolent, slowly expanding, firm nodular lesion clinically resembling a cutaneous cylindroma or trichoblastoma. Her medical history showed no evidence of previous skin lesions or systemic tumour disease. At physical examination no signs of systemic disease were found and superficial lymph nodes were not palpable. Haematological tests were also within the normal range. The lesion was surgically excised and submitted for histological examination. Macroscopically the lesion was a 1.9×1.7×2.3 cm grey-tan, fleshy tumour with poorly circumscribed borders. Microscopically it showed a characteristic basophilic appearance at low-power magnification due to nuclear hyperchromatism. The tumour was poorly circumscribed and consisted of medium-small basaloid cells, with scanty cytoplasm and small, inconspicuous nucleoli, arranged in islands, cords and strands with a glandular, cystic, cribriform architecture, embedded in a fibrous stroma (Figure 2 A and B). The lesion invaded the mid and deep dermis but not the subcutaneous fat. The basaloid cells were never arranged in peripheral palisades and no connection to the overlying epidermis was observed. Mitotic activity was very low (1 mitosis per 10 high-power fields), whereas necrosis, perineural invasion and lymphatic and/or blood vessel infiltration were not present. At immunohistochemistry, as expected, the tumour expressed low and high molecular weight keratins. Variable expression of S-100 protein and carcino-embryonic antigen (CEA) was also observed, the latter restricted to the luminal spaces (data not shown). To better analyse the tumour at the phenotypic level, selected monoclonal antibodies to type IV collagen and epithelial membrane antigen (EMA) were also used. The former highlighted the hyaline deposits among the basaloid cells and the latter the apical aspects of the pseudoglandular areas (Figure 2).

The basophilic intraglandular material, variably detected in the tumour, was also highlighted by alcian blue staining at pH 2.5 and showed immunoreactivity for laminin (data not shown). Both the histological and the immunophenotypical evaluation suggested the diagnosis of adenoid cystic carcinoma (ACC), but considering the rarity of this entity as a primary skin tumour it was necessary to rule out the possibility of a skin metastasis arising from other malignancies with histological features of ACC. Adenoid cystic carcinoma is in fact most commonly seen as a neoplasm of the salivary glands and of the seromucinous glands of the upper respiratory tract. It has also been reported to occur in the breast, lung, uterine cervix, prostate and lacrimal gland. 1,8 Consequently, careful breast and otolaryngological examinations were performed at specialist level, followed by a head and neck ultrasonographic scan. A PET-CT scan was also performed to complete the oncological screening. Fortunately, the clinical examination was negative and all radiological examinations were within the normal range. Taken together, the clinical, morphological and immunophenotypical features described above supported the final diagnosis of a primary cutaneous adenoid cystic carcinoma (PCACC). To avoid local recurrence, a wide surgical excision with 2 cm disease-free margins was finally performed. After 12 months of follow-up the patient has fully recovered, without any evidence of disease.

Discussion
Primary cutaneous adenoid cystic carcinoma is a rare skin tumour. It was first described by Boggio in 1975 3 and, to the best of our knowledge, only 61 cases of PCACC have been reported in the literature. The lesion frequently arises in the scalp of middle-aged or elderly patients, with a slight predilection for women, at an average age of 59 years. 1 Perineural invasion may be observed in about 76% of cases, and this feature seems to have prognostic value, with a doubling of the relative risk of local recurrence when it is detected (46% vs. 22%). 1 In the case reported herein the patient is surprisingly young (32 years), only two younger patients (both aged 14 years) having been registered in the literature. 1 The lesion showed no evidence of perineural invasion or of lymphatic or blood vessel infiltration (several histological preparations were examined). As mentioned above, adenoid cystic carcinoma is relatively frequent in the salivary glands, and in this context it can be an aggressive tumour. ACC has also been reported to occur in the breast, lung, vulva, cervix and prostate. 1,8 Considering the age of the patient and the rarity of this tumour in the skin, it was imperative to rule out a skin metastasis from an ACC arising in other organs. This has important clinical implications, because the occurrence of a skin metastasis from a "clinically occult ACC" requires a more complex oncological approach for advanced tumour disease. In our case the negative PET-CT scan and the careful clinical evaluation (in particular of the breast and salivary glands) allowed the diagnosis of PCACC. As mentioned above, PCACC is characterized by a less aggressive behaviour than the analogous tumour arising in the salivary gland. The occurrence of lung and lymph node metastases is in fact exceptionally rare in PCACC, 7-10 although they can, if infrequently, be found in ACCs arising from the salivary gland and breast. Moreover, perineural invasion, a hallmark feature of salivary gland ACCs, may be absent in PCACC, as in the case described here. 8

An indolent course is the major feature of PCACC, together with a high tendency to local recurrence. 1,8 For this reason the recommended treatment for PCACC is a wide surgical excision with at least 2 cm tumour-free margins.

Differential diagnosis
Once a skin metastasis from an ACC arising in other organs has been definitively ruled out, the differential diagnosis should include several primary skin tumours: the adenoid-cystic variant of basal cell carcinoma (BCC), mucinous carcinoma (MC), dermal cylindroma (DC) and primary cutaneous cribriform apocrine carcinoma (PCCAC). All the mentioned lesions closely mimic PCACC at histology. 8 The lack of basaloid cells arranged in peripheral palisades, in an adenoid-cystic lesion without connection to the overlying epidermis, helps distinguish a PCACC from the more common adenoid-cystic type of BCC. 8 Another helpful histological feature for this differential diagnosis is the presence of artefactual clefts between tumour cells and stroma in BCC. 8 MC can be distinguished from PCACC by histochemistry; the former differs from PCACC in the sialomucins produced. 8 In DC the tumour lobules contain nodular deposits of hyaline pink material; furthermore, the lobules of cylindroma show a mosaic or jigsaw pattern, are surrounded by hyaline material and present peripheral small cells with scanty cytoplasm and dark nuclei. 8 Spiradenoma and spiradenocylindroma can also show an adenoid cystic-like pattern, as reported by Petersson and colleagues. 8,13 In these tumours, however, this feature is always focal (maximum of 2.5 mm); for this reason a single punch biopsy of such lesions may not be representative enough for a definitive histological diagnosis. Another feature lacking in PCACC is the lobular architecture of spiradenoma. 8 Primary cutaneous cribriform apocrine carcinoma (PCCAC) shows a diffuse cribriform pattern throughout the entire neoplasia, interconnections of the tumour cell aggregates, nuclear polymorphism and authentically elongated nuclei, features that are absent in PCACC. 12 In contrast to PCCAC, PCACC presents spaces of uniform size and shape between the tumour cell aggregates and deposits of basement membrane material. 8 Finally, the neurotropism that may be found in PCACC has never been observed in PCCAC. 1,8 The histogenesis of PCACC is still uncertain and an apocrine or eccrine derivation is debated. 14 In conclusion, the prompt recognition of a PCACC is important. Integration of the clinical and histopathological data is essential for a correct diagnosis; this will avoid both patient distress and unnecessary overtreatment.
// SilvervineUE4Lua / devCAT studio
// Copyright 2016 - 2019. Nexon Korea Corporation. All rights reserved.

#include "LuaUnitTests.h"

#include "Kismet/KismetSystemLibrary.h"

#include "LuaVirtualMachine.h"

//=============================================================================================================================
// FSUE4LuaTestCaseUE4
//=============================================================================================================================
IMPLEMENT_SIMPLE_AUTOMATION_TEST(FSUE4LuaTestCaseUE4, "Silvervine.UE4Lua.UE4", SUE4LUA_ATF_UNITTEST)
bool FSUE4LuaTestCaseUE4::RunTest(const FString& Parameters)
{
	auto VM = FSUE4LuaVirtualMachine::Create();
	if (!VM.IsValid())
	{
		return false;
	}

	auto TestUObject = NewObject<USUE4LuaTestUObject>();

	{
		VM->ExecuteString(
			TEXT("\n	function Test(uobj)")
			TEXT("\n		return UE4.IsValid(uobj)")
			TEXT("\n	end"));

		TestTrue(TEXT("UE4.IsValid() == true"), FSUE4LuaFunction::CallGlobal<bool>(VM.ToSharedRef(), TEXT("Test"), TestUObject));
	}
	{
		VM->ExecuteString(
			TEXT("\n	function Test()")
			TEXT("\n		return UE4.IsValid(nil)")
			TEXT("\n	end"));

		TestTrue(TEXT("UE4.IsValid() == false"), !FSUE4LuaFunction::CallGlobal<bool>(VM.ToSharedRef(), TEXT("Test")));
	}
	{
		VM->ExecuteString(
			TEXT("\n	function Test()")
			TEXT("\n		return UE4.FindClass('SUE4LuaTestUObject') ~= nil")
			TEXT("\n	end"));

		TestTrue(TEXT("UE4.FindClass()"), FSUE4LuaFunction::CallGlobal<bool>(VM.ToSharedRef(), TEXT("Test")));
	}
	{
		VM->ExecuteString(
			TEXT("\n	local UKismetSystemLibrary = UE4.FindClass('KismetSystemLibrary')")
			TEXT("\n	function Test(uobj) return UKismetSystemLibrary.GetObjectName(uobj) end"));

		auto Arg = NewObject<USUE4LuaTestUObject>();

		TestTrue(TEXT("UKismetSystemLibrary.GetObjectName()"),
			FSUE4LuaFunction::CallGlobal<FString>(VM.ToSharedRef(), TEXT("Test"), Arg) == UKismetSystemLibrary::GetObjectName(Arg));
	}

	return true;
}
import { Body, Controller, Get, Post } from '@nestjs/common';
import { AppService } from './app.service';
import { AuthHttpService } from './auth/http.service';
import { createUserResponseDto } from './user/dto/createResponse.dto';
import { CreateUserDto } from './user/dto/createUser.dto';

@Controller()
export class AppController {
  constructor(
    private readonly appService: AppService,
    private readonly authHttpService: AuthHttpService,
  ) {}

  @Get()
  getHello(): string {
    return this.appService.getHello();
  }

  @Post('/register')
  async register(@Body() body: CreateUserDto): Promise<createUserResponseDto> {
    // Forward the registration payload to the auth service and return its response body.
    const response = await this.authHttpService.axios.post('/user/register', body);
    return response.data;
  }
}
Robert Pattinson Up for Daredevil? After the news surfaced that David Slade is directing the Untitled Daredevil Reboot, talk has now turned to who will star as Matt Murdock. There is speculation that Robert Pattinson may be in talks for the lead role. David Slade directed Robert Pattinson in The Twilight Saga: Eclipse last year, and it seems there is "some kind of talk" about Robert Pattinson reuniting with the director for Daredevil. Robert Pattinson is currently shooting his last movie as the vampire Edward Cullen, The Twilight Saga: Breaking Dawn - Part 2. The Untitled Daredevil Reboot is said to be a "close cousin" to other superhero reboots. Here's what an inside source recently had to say about David Slade's pitch for the reboot. "It's a bit (like) Batman Begins. The bad guy will learn who Daredevil really is and tries to destroy him - but not via the usual methods." Of course, none of this has been confirmed by the studio as of yet. The Untitled Daredevil Reboot will certainly be a hot commodity and we'll have to wait and see what actors start to circle the project.
package com.tu.FinancialQuickCheck.dto;

import com.tu.FinancialQuickCheck.db.ProjectUserEntity;

import java.util.ArrayList;
import java.util.List;

public class ListOfProjectUserDto {

    public List<ProjectUserDto> projectUsers;

    public ListOfProjectUserDto(Iterable<ProjectUserEntity> projectUserEntities){
        this.projectUsers = new ArrayList<>();
        for (ProjectUserEntity tmp : projectUserEntities) {
            this.projectUsers.add(new ProjectUserDto(tmp));
        }
    }
}
package io.eventuate.tram.sagas.common; import org.junit.Assert; import org.junit.Test; public abstract class SagaLockManagerImplSchemaTest { @Test public void testInsertIntoSagaLockTable() { Assert.assertEquals(getExpectedInsertIntoSagaLockTable(), getSagaLockManagerSql().getInsertIntoSagaLockTableSql()); } @Test public void testInsertIntoSagaStashTable() { Assert.assertEquals(getExpectedInsertIntoSagaStashTable(), getSagaLockManagerSql().getInsertIntoSagaStashTableSql()); } @Test public void testSelectFromSagaLockTable() { Assert.assertEquals(getExpectedSelectFromSagaLockTable(), getSagaLockManagerSql().getSelectFromSagaLockTableSql()); } @Test public void testSelectFromSagaStashTable() { Assert.assertEquals(getExpectedSelectFromSagaStashTable(), getSagaLockManagerSql().getSelectFromSagaStashTableSql()); } @Test public void testUpdateSagaLockTable() { Assert.assertEquals(getExpectedUpdateSagaLockTable(), getSagaLockManagerSql().getUpdateSagaLockTableSql()); } @Test public void testDeleteFromSagaLockTable() { Assert.assertEquals(getExpectedDeleteFromSagaLockTable(), getSagaLockManagerSql().getDeleteFromSagaLockTableSql()); } @Test public void testDeleteFromSagaStashTable() { Assert.assertEquals(getExpectedDeleteFromSagaStashTable(), getSagaLockManagerSql().getDeleteFromSagaStashTableSql()); } protected abstract SagaLockManagerSql getSagaLockManagerSql(); protected abstract String getExpectedInsertIntoSagaLockTable(); protected abstract String getExpectedInsertIntoSagaStashTable(); protected abstract String getExpectedSelectFromSagaLockTable(); protected abstract String getExpectedSelectFromSagaStashTable(); protected abstract String getExpectedUpdateSagaLockTable(); protected abstract String getExpectedDeleteFromSagaStashTable(); protected abstract String getExpectedDeleteFromSagaLockTable(); }
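The class above is a template-method test: each database dialect provides a concrete subclass that supplies both the object under test (via getSagaLockManagerSql()) and the dialect's expected SQL strings, while the @Test methods live once in the abstract base. Below is a self-contained sketch of the same pattern; DialectSql stands in for SagaLockManagerSql, and every name and SQL literal here is an illustrative assumption rather than the library's actual API.

import org.junit.Assert;
import org.junit.Test;

// Self-contained illustration of the template-method test pattern above.
// DialectSql stands in for SagaLockManagerSql; all names are hypothetical.
abstract class AbstractDialectSqlTest {

    // Single-method stand-in for the SQL provider under test.
    interface DialectSql {
        String insertIntoLockTableSql();
    }

    // The assertion is written once here and reused by every dialect subclass.
    @Test
    public void testInsertIntoLockTable() {
        Assert.assertEquals(expectedInsertIntoLockTableSql(), dialectSql().insertIntoLockTableSql());
    }

    protected abstract DialectSql dialectSql();

    protected abstract String expectedInsertIntoLockTableSql();
}

// One hypothetical dialect pins down its concrete SQL:
class ExampleDialectSqlTest extends AbstractDialectSqlTest {

    private static final String INSERT_SQL =
            "INSERT INTO saga_lock_table(target, saga_type, saga_id) VALUES(?, ?, ?)";

    @Override
    protected DialectSql dialectSql() {
        return () -> INSERT_SQL;
    }

    @Override
    protected String expectedInsertIntoLockTableSql() {
        return INSERT_SQL;
    }
}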
package org.basex.util.options;

/**
 * Option containing an enumeration value.
 *
 * @author BaseX Team 2005-16, BSD License
 * @author <NAME>
 * @param <V> enumeration value
 */
public final class EnumOption<V extends Enum<V>> extends Option<V> {
  /** Class. */
  private final Class<V> clazz;
  /** Default value. */
  private final V value;

  /**
   * Constructor.
   * @param name name
   * @param value value
   */
  @SuppressWarnings("unchecked")
  public EnumOption(final String name, final V value) {
    super(name);
    this.value = value;
    clazz = (Class<V>) value.getClass();
  }

  /**
   * Constructor.
   * @param name name
   * @param clazz class
   */
  public EnumOption(final String name, final Class<V> clazz) {
    super(name);
    this.clazz = clazz;
    value = null;
  }

  @Override
  public V value() {
    return value;
  }

  /**
   * Returns an enumeration value for the specified string or {@code null}.
   * @param string value
   * @return enumeration
   */
  public V get(final String string) {
    for(final V v : values()) if(v.toString().equals(string)) return v;
    return null;
  }

  /**
   * Returns all enumeration values.
   * @return enumeration
   */
  public V[] values() {
    return clazz.getEnumConstants();
  }
}
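A brief usage sketch of EnumOption follows; the Casing enum, the option name and the demo class are invented for illustration, while EnumOption itself behaves as defined above.

import org.basex.util.options.EnumOption;

// Hypothetical usage of EnumOption; Casing and the option name are illustrative.
enum Casing { LOWER, UPPER, MIXED }

class EnumOptionDemo {
  // Option with a default value; the enum class is derived from the value.
  static final EnumOption<Casing> CASING = new EnumOption<>("CASING", Casing.LOWER);

  public static void main(String[] args) {
    System.out.println(CASING.value());        // LOWER (the declared default)
    System.out.println(CASING.get("UPPER"));   // UPPER (matched by its string form)
    System.out.println(CASING.get("unknown")); // null  (no matching constant)
  }
}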
Comparison of C-reactive protein in normotensive and hypertensive type 2 diabetic patients

1,2,4 Department of Biochemistry, 3 Department of Gen. Medicine, National Institute of Medical Sciences and Research, Jaipur

*Corresponding Author: Dr Suresh Babu Kondaveeti, Associate Professor, Department of Biochemistry, National Institute of Medical Science & Research, NIMS University, Jaipur, Rajasthan

Abstract
Background: The present study aimed to compare C-reactive protein levels between normotensive and hypertensive type 2 diabetic subjects with microalbuminuria.
Introduction: CRP production is part of the nonspecific acute-phase response to most forms of inflammation, infection and tissue damage. The association between CRP and hypertension may be explained in part as follows: the correlation between elevated CRP and arterial stiffness; the association between CRP and the metabolic syndrome, one of whose criteria is hypertension; and the possibility that CRP directly contributes to reduced nitric oxide synthesis in endothelial cells, leading to increased vascular resistance.
Material & Methods: The present study included 500 type 2 diabetic subjects, divided into two groups based on hypertension status. The biochemical parameter C-reactive protein (CRP) was analysed in the Department of Biochemistry of NIMS Medical College. Hypertension was defined according to the WHO definition: systolic blood pressure (SBP) ≥140 mmHg and diastolic blood pressure (DBP) ≥90 mmHg.
Results: There were significant differences between the patient and control groups with regard to age, blood pressure, lipid profile and CRP. The study also showed significant positive correlations between CRP, age and blood pressure.
Conclusion: CRP levels were elevated in hypertensive individuals, which suggests the possibility of an inflammatory pathogenesis in hypertension.

Introduction
Hypertension is a major health concern because it is a risk factor for a number of cardiovascular diseases, including stroke, atherosclerosis, type 2 diabetes, coronary heart disease and renal disease. It affects 26% of adults worldwide, and its prevalence is predicted to increase to 29% by 2025. Data from more than 30 epidemiologic studies have shown a significant association between elevated serum concentrations of C-reactive protein (CRP) and the prevalence of underlying atherosclerosis, the incidence of a first cardiovascular event in individuals at risk for atherosclerosis, and the risk of recurrent cardiovascular events among patients with established disease. In recent years, a role for the inflammatory process in the pathogenesis of hypertension has been suggested; consequently, the relationship between CRP and hypertension can be evaluated. In recent years the term hs-CRP has been used widely. One common misunderstanding has been the incorrect belief that hs-CRP is different in some way from CRP. In fact, hs-CRP only denotes the use of an assay designed to measure very low levels of CRP, that is, so-called low-grade inflammation. Low-grade and acute inflammatory states differ from each other in several ways: for instance, the latter occurs in response to infection and tissue injury, whereas the former is induced in response to metabolic stress. One study suggested that low-grade inflammation causes endothelial dysfunction and impaired nitric oxide availability, leading to increased production of oxidative stress.

Moreover, the relationship between this form of inflammation and obesity, a major risk factor for hypertension, has been evaluated before. The aim of our work was to compare the level of CRP in hypertensive and normotensive individuals.

Materials and Methods
The present study included 500 type 2 diabetes mellitus patients with microalbuminuria attending NIMS Medical College & Hospital, Jaipur. Based on blood pressure, they were divided into two groups: 250 normotensive patients in Group A and 250 hypertensive patients in Group B. 3) Pre-existing kidney/prostatic disease. 4) Congestive heart failure. 5) Pregnancy. 6) Receiving any hypolipidaemic drugs.

Fig. 1: Age-wise distribution of patients
In this study the minimum age was 26 years and the maximum age was 65 years. Of the 500 patients, 58% were male and 42% were female. Fig. 1 shows that in the normotensive diabetic group (Group A) the largest number of patients (38%) was in the age group >55 years, whereas in the hypertensive diabetic group (Group B) the largest number of patients (36%) was in the age group of 46-55 years. On the other hand, no correlation was found between CRP or UACR (urine albumin-creatinine ratio) and diastolic blood pressure in either group. These data show that systolic blood pressure is strongly correlated with CRP and UACR, while diastolic blood pressure is not. Our finding is supported by Lakoski et al., who conducted a study in 6814 men and women aged 45 to 84 years recruited in six U.S. communities and concluded that systolic BP and pulse pressure, but not diastolic pressure, were associated with CRP (p=0.0001, 0.0001 and 0.5, respectively). A similar study by Schillaci et al. in 135 newly diagnosed, never-treated patients with hypertension and 40 healthy, matched non-hypertensive controls concluded that among hypertensive patients plasma CRP was related to 24-h systolic blood pressure (r=0.28, p<0.01) and pulse pressure (r=0.32, p<0.01), but not to diastolic blood pressure (r=0.12, p<0.2). Similar results were found in other studies by Stuveling et al., Tsioufis et al. and Nakamura et al. Stuveling et al. also showed that the association between microalbuminuria and CRP was more significant (p<0.001) in subjects with high mean arterial pressure; they concluded that BP positively modified the relationship between microalbuminuria and CRP. Tsioufis et al. showed that microalbuminuria is accompanied by an increase in CRP level in the setting of hypertension, reflecting a diffuse atherosclerotic process. The findings of the present study are thus consistent with the results of the studies mentioned above.

Conclusion
In conclusion, our results suggest that increased serum CRP levels are associated with hypertension, most significantly in type 2 diabetic cases. Estimation of CRP can thus be a potential tool for identifying individuals at risk of developing hypertension and, eventually, cardiovascular disease in type 2 diabetes.
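As a worked illustration of the grouping criterion stated in the Methods above, the sketch below encodes the quoted WHO thresholds (SBP ≥140 mmHg and DBP ≥90 mmHg); the class and method names are illustrative and not part of the study.

// Sketch of the Group A / Group B assignment described in the Methods.
// Thresholds follow the WHO definition quoted above; all names are illustrative.
final class BpGrouping {

    static boolean isHypertensive(int sbpMmHg, int dbpMmHg) {
        return sbpMmHg >= 140 && dbpMmHg >= 90;
    }

    static String group(int sbpMmHg, int dbpMmHg) {
        return isHypertensive(sbpMmHg, dbpMmHg) ? "Group B (hypertensive)" : "Group A (normotensive)";
    }

    public static void main(String[] args) {
        System.out.println(group(150, 95)); // Group B (hypertensive)
        System.out.println(group(120, 80)); // Group A (normotensive)
    }
}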
Experimental investigations of nanosecond-pulsed Nd:YAG laser beam micromachining on 304 stainless steel
Abstract The demand for miniaturized components is increasing day by day, as their applications range across industries such as biomedicine, micro-electro-mechanical systems and aerospace. In the present research work, high-quality micro-channels are fabricated on 304 stainless steel by the laser beam micromachining process with a nanosecond Nd:YAG laser. The laser pulse energy (LPE), scanning speed (SS) and scanning pass number (SP No.) are used as the process parameters, whereas the depth and width of the kerf as well as the surface roughness are used to characterize the micro-channels. It is found that the kerf depth, kerf width and surface roughness decrease with increasing SS. The kerf depth increases sharply with increasing SP No. The kerf width is minimum at 30 mJ LPE, 400 µm s⁻¹ SS and 10 SP No. The minimum surface roughness is observed at 30 mJ LPE, 500 µm s⁻¹ SS and 10 SP No. The oxygen content is found to decrease gradually with distance from the centre of the micro-channel. Based on the experimental results, optimized input parameters can be offered to control the micro-channel dimensions and improve their surface finish effectively on stainless steel.
BATON ROUGE, LA (WAFB) - Southern quarterback Ladarius Skelton has been named the SWAC Offensive Player of the Week after a dominating performance Saturday against Prairie View. The sophomore was 15-of-26 passing for 168 yards and a touchdown in his first career start. He also ran the ball 27 times for 202 yards and three touchdowns. In total, Skelton accounted for 370 yards of offense and four touchdowns. The Jaguars beat the Panthers 38-0, improving to 3-3 overall and 2-1 in conference play.