Adaptive neural Preisach model and model predictive control of Shape Memory Alloy actuators Shape memory alloy (SMA) actuators are potential alternatives to conventional actuators in many surgical, aeronautic and biorobotic applications. However, their highly nonlinear hysteretic behavior between temperature and strain makes them difficult to use in real-time applications. The Preisach model is a well-known phenomenological method to accurately model the hysteresis of many physical systems. In this paper, the numerical Preisach model is modified such that it can adapt to changes in the operating conditions, and can easily be used in real-time for controlling the strain in the SMA actuators. For this purpose, both the first-order descending and ascending curves are used and each of these curves is approximated by an artificial neural network (ANN). Weights of the ANNs are then updated online using the extended Kalman filter algorithm. To control the strain in the SMA actuator, a model predictive controller is implemented that uses the temperature dynamics and the adaptive modified Preisach model to calculate the optimal control signal using the Levenberg-Marquardt algorithm. The performance of the proposed model and control scheme is validated using an experimental setup. |
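For orientation, a minimal numerical sketch of the classical Preisach operator that the paper adapts: strain is modeled as a weighted sum of two-state relay hysterons driven by temperature. The threshold pairs, weights, and Python framing are illustrative assumptions; the paper's ANN approximation of the first-order curves and the EKF weight updates are not reproduced here.
import numpy as np

def preisach_output(input_series, up_thresholds, down_thresholds, weights):
    """Hysteretic output of a discrete Preisach model for a 1-D input series.

    Hysteron k switches to +1 when the input rises to or above up_thresholds[k],
    to -1 when it falls to or below down_thresholds[k], and otherwise holds its
    previous state (up_thresholds[k] >= down_thresholds[k] is assumed).
    """
    state = -np.ones_like(np.asarray(weights, dtype=float))  # start fully switched down
    outputs = []
    for u in input_series:
        state = np.where(u >= up_thresholds, 1.0, state)
        state = np.where(u <= down_thresholds, -1.0, state)
        outputs.append(float(np.dot(weights, state)))        # weighted sum of relay states
    return np.array(outputs)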
Pseudoinvariant feature selection for cross-sensor optical satellite images Abstract. Processing of multitemporal satellite images generally suffers from uncertainties caused by differences in illumination and observation angles, as well as variation in atmospheric conditions. Moreover, satellite images acquired from different sensors contain not only these uncertainties but also disparate relative spectral responses. Given that radiometric calibration and correction of satellite images are difficult without ground measurements during data acquisition, this study addresses pseudoinvariant feature selection for relative radiometric normalization (RRN), which minimizes the radiometric differences among images caused by atmospheric and spectral band inconsistencies during data acquisition. The key to a successful RRN is the selection of pseudoinvariant features (PIFs) among bitemporal images. To select PIFs, the multivariate alteration detection (MAD) algorithm is adopted with kernel canonical correlation analysis (KCCA) instead of canonical correlation analysis (CCA). KCCA, which assumes that the relation between the at-sensor radiances is nonlinear, can obtain more appropriate PIFs for cross-sensor images than CCA, which assumes that the relation between the at-sensor radiances of the bitemporal images is linear. In addition, a regularization term is added to the optimization of KCCA to avoid trivial solutions and overfitting. Qualitative and quantitative analyses on bitemporal images acquired by the Landsat-7 Enhanced Thematic Mapper Plus (ETM+) and Landsat-8 Operational Land Imager (OLI) sensors were conducted to evaluate the proposed method. The experimental results demonstrate the superiority of the proposed KCCA-based MAD over the CCA-based MAD in terms of PIF selection, particularly for images containing significant cloud cover. |
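As a rough illustration of the baseline this paper improves on, here is a minimal CCA-based MAD sketch for PIF selection using standard scikit-learn CCA; the paper's regularized kernel CCA variant is not reproduced, and the array shapes, component count, and chi-square threshold are assumptions.
from scipy.stats import chi2
from sklearn.cross_decomposition import CCA

def select_pifs(X, Y, n_components=3, no_change_quantile=0.01):
    """X, Y: (n_pixels, n_bands) at-sensor radiance arrays from the two dates."""
    cca = CCA(n_components=n_components)
    cca.fit(X, Y)
    U, V = cca.transform(X, Y)                 # canonical variates of each date
    D = U - V                                  # MAD variates (change information)
    D = (D - D.mean(axis=0)) / D.std(axis=0)   # standardize each MAD component
    T = (D ** 2).sum(axis=1)                   # roughly chi-square under "no change"
    threshold = chi2.ppf(no_change_quantile, df=n_components)
    return T < threshold                       # boolean mask of candidate PIF pixels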
"""tests for colorize
"""
import json
import os
from unittest.mock import patch
from ansible_navigator.ui_framework.colorize import Colorize
from ansible_navigator.yaml import human_dump
SHARE_DIR = os.path.abspath(
    os.path.join(os.path.dirname(__file__), "..", "share", "ansible_navigator")
)
def test_basic_success_json():
"""Ensure the json string is returned as 1 lines, 5 parts and can be reassembled
to the json string"""
sample = json.dumps({"test": "data"})
result = Colorize(SHARE_DIR).render(doc=sample, scope="source.json")
assert len(result) == 1
assert len(result[0]) == 5
assert "".join(line_part["chars"] for line_part in result[0]) == sample
def test_basic_success_yaml():
"""Ensure the yaml string is returned as 2 lines, with 1 and 3 parts
respectively, ensure the parts of the second line can be reaseembled to
the second line of the yaml string
"""
sample = human_dump({"test": "data"})
result = Colorize(SHARE_DIR).render(doc=sample, scope="source.yaml")
assert len(result) == 2
assert len(result[0]) == 1
assert result[0][0]["chars"] == sample.splitlines()[0]
assert len(result[1]) == 3
assert "".join(line_part["chars"] for line_part in result[1]) == sample.splitlines()[1]
@patch("ansible_navigator.ui_framework.colorize.tokenize")
def test_graceful_failure(mocked_func, caplog):
"""Ensure a tokenization error returns the original one line json string
w/o color and the log reflects the critical error
"""
mocked_func.side_effect = ValueError()
sample = json.dumps({"test": "data"})
result = Colorize(SHARE_DIR).render(doc=sample, scope="source.json")
assert len(result) == 1
assert len(result[0]) == 1
assert result[0][0]["chars"] == sample
assert result[0][0]["color"] is None
assert result[0][0]["column"] == 0
assert "rendered without color" in caplog.text
|
import { Component, OnInit } from '@angular/core';
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Router, ActivatedRoute } from '@angular/router';
import { AuthBanana } from '../../../utils/auth';
import { tokenUtil } from '../../../utils/tokenUtil';
import { notifyManage, showNotification } from '../../../utils/notifyUtil';
import { BananaConstants } from '../../../utils/constants';
import { Contact } from '../../../models/contact';
import {Md5} from 'ts-md5/dist/md5';
@Component({
selector: 'app-users-crud',
templateUrl: './users-crud.component.html',
styleUrls: ['./users-crud.component.css']
})
export class UsersCrudComponent implements OnInit {
email: string;
  titleUser = 'Edit User';
user: any = {};
  typeView: number;
  permissions: any[];
  selectedPermission: any[] = [];
  contact: Contact = new Contact();
  ConfirmPassword: string;
  loading = false;
  combo_select: any[];
constructor(public http: HttpClient, public router: Router, private _activeRoute: ActivatedRoute) { }
ngOnInit() {
AuthBanana(this.router)
this.getElements();
this._activeRoute.url.subscribe(url => {
if(url[2].path === 'edit'){
this.titleUser = 'Edit User';
this.typeView = 3;
this.email = this._activeRoute.snapshot.params['email'];
this.getUsers(this.email);
} else {
this.titleUser = 'Create User';
this.user.id = -1;
this.typeView = 1;
}
});
}
getUsers(email): void {
this.loading = true;
const headers = new HttpHeaders().set('authorization', window.location.origin)
.append('user', sessionStorage.getItem('user_id'))
.append('token', sessionStorage.getItem('user_token'))
.append('app', 'bananaCli');
const options = {
headers: headers,
};
this.http.get(BananaConstants.urlServer+'api/user/' + email, options).toPromise().then(
result => {
console.log('result.status', result);
const body :any = result;
this.user = body[0];
this.user.password = '';
this.contact = this.user.contact_id;
this.getPermits(2,this.user.id);
this.loading = false;
},
msg => {
if (msg.status == 406) {
tokenUtil(this.router);
}
this.loading = false;
notifyManage(msg);
}
);
}
updateUser(): void {
this.loading = true;
showNotification("Actualizando tercero", 2);
let body : any;
const headers = new HttpHeaders().set('authorization', window.location.origin)
.append('user', sessionStorage.getItem('user_id'))
.append('token', sessionStorage.getItem('user_token'))
.append('app', 'bananaCli');
const options = {
headers: headers,
};
const md5 = new Md5();
body = this.user;
this.http.put(BananaConstants.urlServer+'api/users/update', body, options).toPromise().then(
result => {
        showNotification('Saved successfully', 1);
this.loading = false;
},
msg => {
if (msg.status == 406) {
tokenUtil(this.router);
}
this.loading = false;
notifyManage(msg);
}
);
}
createUser(): void {
this.loading = true;
    showNotification('Creating user', 2);
let body : any;
const headers = new HttpHeaders().set('authorization', window.location.origin)
.append('user', sessionStorage.getItem('user_id'))
.append('token', sessionStorage.getItem('user_token'))
.append('app', 'bananaCli');
const options = {
headers: headers,
};
body = this.user;
this.http.post(BananaConstants.urlServer+'api/users/create', body, options).toPromise().then(
result => {
        showNotification('Saved successfully', 1);
this.loading = false;
},
msg => {
if (msg.status == 406) {
tokenUtil(this.router);
}
this.loading = false;
notifyManage(msg);
}
);
}
getElements(): void {
this.loading = true;
const headers = new HttpHeaders().set('authorization', window.location.origin)
.append('user', sessionStorage.getItem('user_id'))
.append('token', sessionStorage.getItem('user_token'))
.append('app', 'bananaCli');
const options = {
headers: headers,
};
this.http.get(BananaConstants.urlServer+'api/users/elements', options).toPromise().then(
result => {
console.log('result.status', result);
const body :any = result;
this.combo_select = body.elements;
console.log(this.combo_select)
this.loading = false;
},
msg => {
if (msg.status == 406) {
tokenUtil(this.router);
}
this.loading = false;
notifyManage(msg);
}
);
}
getPermits(type,id): void {
this.loading = true;
const headers = new HttpHeaders().set('authorization', window.location.origin)
.append('user', sessionStorage.getItem('user_id'))
.append('token', sessionStorage.getItem('user_token'))
.append('app', 'bananaCli');
const options = {
headers: headers,
params:{
id:id,
type:type
}
};
this.http.get(BananaConstants.urlServer+'api/users/getPermits', options).toPromise().then(
result => {
// console.log('result.status', result);
const body:any = result;
this.permissions = body.permissions;
// console.log(this.permissions)
this.permissSelect();
this.loading = false;
},
msg => {
if (msg.status == 406) {
tokenUtil(this.router);
}
this.loading = false;
notifyManage(msg);
}
);
}
selectColumn(columnsPer, event, action){
let exist = false;
switch (action) {
case 'create':
columnsPer.create = (event) ? 1 : 0;
break;
case 'update':
columnsPer.update = (event) ? 1 : 0;
break;
case 'read':
columnsPer.read = (event) ? 1 : 0;
break;
case 'delete':
columnsPer.delete = (event) ? 1 : 0;
break;
}
for (let i = 0; i < this.selectedPermission.length; i++) {
if (this.selectedPermission[i].column_id == columnsPer.column_id){
this.selectedPermission[i] = columnsPer;
exist = true;
break;
}
}
if (!exist) {
this.selectedPermission.push( columnsPer );
}
}
permissSelect() {
for (let i = 0; i < this.permissions.length; i++) {
for (let j = 0; j < this.permissions[i].columns.length; j++) {
if (this.permissions[i].columns[j].selected == 1) {
this.selectedPermission.push( this.permissions[i].columns[j]);
}
}
}
}
}
|
// Removes and returns the first element of the block.
E removeFirst() {
    assert(size > 0);
    @SuppressWarnings("unchecked")
    E removed = (E) values[offset];
    values[offset] = null;   // drop the reference so it can be garbage collected
    offset = index(1);       // index(1) presumably maps the next logical slot to its physical position
    size--;
    return removed;
} |
// File: modules/congestion_controller/include/send_side_congestion_controller.h (from the bebo/webrtc repository)
/*
* Copyright (c) 2012 The WebRTC project authors. All Rights Reserved.
*
* Use of this source code is governed by a BSD-style license
* that can be found in the LICENSE file in the root of the source
* tree. An additional intellectual property rights grant can be found
* in the file PATENTS. All contributing project authors may
* be found in the AUTHORS file in the root of the source tree.
*/
#ifndef WEBRTC_MODULES_CONGESTION_CONTROLLER_INCLUDE_SEND_SIDE_CONGESTION_CONTROLLER_H_
#define WEBRTC_MODULES_CONGESTION_CONTROLLER_INCLUDE_SEND_SIDE_CONGESTION_CONTROLLER_H_
#include <memory>
#include <vector>
#include "webrtc/common_types.h"
#include "webrtc/modules/congestion_controller/delay_based_bwe.h"
#include "webrtc/modules/congestion_controller/transport_feedback_adapter.h"
#include "webrtc/modules/include/module.h"
#include "webrtc/modules/include/module_common_types.h"
#include "webrtc/modules/pacing/paced_sender.h"
#include "webrtc/modules/pacing/packet_router.h"
#include "webrtc/rtc_base/constructormagic.h"
#include "webrtc/rtc_base/criticalsection.h"
#include "webrtc/rtc_base/networkroute.h"
#include "webrtc/rtc_base/race_checker.h"
namespace rtc {
struct SentPacket;
}
namespace webrtc {
class BitrateController;
class Clock;
class AcknowledgedBitrateEstimator;
class ProbeController;
class RateLimiter;
class RtcEventLog;
class SendSideCongestionController : public CallStatsObserver,
public Module,
public TransportFeedbackObserver {
public:
  // Observer class for bitrate changes announced due to a change in the
  // bandwidth estimate or because the send pacer is full. Fraction loss and
  // RTT are also part of this callback to allow the observer to optimize its
  // settings for different types of network environments. The bitrate does
  // not include packet headers and is measured in bits per second.
class Observer {
public:
virtual void OnNetworkChanged(uint32_t bitrate_bps,
uint8_t fraction_loss, // 0 - 255.
int64_t rtt_ms,
int64_t probing_interval_ms) = 0;
protected:
virtual ~Observer() {}
};
// TODO(nisse): Consider deleting the |observer| argument to constructors
// once CongestionController is deleted.
SendSideCongestionController(const Clock* clock,
Observer* observer,
RtcEventLog* event_log,
PacketRouter* packet_router);
SendSideCongestionController(const Clock* clock,
Observer* observer,
RtcEventLog* event_log,
std::unique_ptr<PacedSender> pacer);
~SendSideCongestionController() override;
void RegisterPacketFeedbackObserver(PacketFeedbackObserver* observer);
void DeRegisterPacketFeedbackObserver(PacketFeedbackObserver* observer);
// Currently, there can be at most one observer.
void RegisterNetworkObserver(Observer* observer);
void DeRegisterNetworkObserver(Observer* observer);
virtual void SetBweBitrates(int min_bitrate_bps,
int start_bitrate_bps,
int max_bitrate_bps);
  // Resets the BWE state on a network route change; |bitrate_bps| is the
  // start bitrate used for the new route.
virtual void OnNetworkRouteChanged(const rtc::NetworkRoute& network_route,
int bitrate_bps,
int min_bitrate_bps,
int max_bitrate_bps);
virtual void SignalNetworkState(NetworkState state);
virtual void SetTransportOverhead(size_t transport_overhead_bytes_per_packet);
virtual BitrateController* GetBitrateController() const;
virtual int64_t GetPacerQueuingDelayMs() const;
virtual int64_t GetFirstPacketTimeMs() const;
// TODO(nisse): Delete this accessor function. The pacer should be
// internal to the congestion controller.
virtual PacedSender* pacer();
virtual TransportFeedbackObserver* GetTransportFeedbackObserver();
RateLimiter* GetRetransmissionRateLimiter();
void EnablePeriodicAlrProbing(bool enable);
  // SetAllocatedSendBitrateLimits sets bitrate limits imposed by send codec
  // settings.
  // |min_send_bitrate_bps| is the total minimum send bitrate required by all
  // sending streams. This is the minimum bitrate the PacedSender will use.
  // Note that SendSideCongestionController::OnNetworkChanged can still be
  // called with a lower bitrate estimate. |max_padding_bitrate_bps| is the max
  // bitrate the send streams request for padding. This can be higher than the
  // current network estimate and tells the PacedSender how much padding it may
  // send unless there are real packets to send.
void SetAllocatedSendBitrateLimits(int min_send_bitrate_bps,
int max_padding_bitrate_bps);
virtual void OnSentPacket(const rtc::SentPacket& sent_packet);
// Implements CallStatsObserver.
void OnRttUpdate(int64_t avg_rtt_ms, int64_t max_rtt_ms) override;
// Implements Module.
int64_t TimeUntilNextProcess() override;
void Process() override;
// Implements TransportFeedbackObserver.
void AddPacket(uint32_t ssrc,
uint16_t sequence_number,
size_t length,
const PacedPacketInfo& pacing_info) override;
void OnTransportFeedback(const rtcp::TransportFeedback& feedback) override;
std::vector<PacketFeedback> GetTransportFeedbackVector() const override;
private:
void MaybeTriggerOnNetworkChanged();
bool IsSendQueueFull() const;
bool IsNetworkDown() const;
bool HasNetworkParametersToReportChanged(uint32_t bitrate_bps,
uint8_t fraction_loss,
int64_t rtt);
const Clock* const clock_;
rtc::CriticalSection observer_lock_;
Observer* observer_ GUARDED_BY(observer_lock_);
RtcEventLog* const event_log_;
const std::unique_ptr<PacedSender> pacer_;
const std::unique_ptr<BitrateController> bitrate_controller_;
std::unique_ptr<AcknowledgedBitrateEstimator> acknowledged_bitrate_estimator_;
const std::unique_ptr<ProbeController> probe_controller_;
const std::unique_ptr<RateLimiter> retransmission_rate_limiter_;
TransportFeedbackAdapter transport_feedback_adapter_;
rtc::CriticalSection network_state_lock_;
uint32_t last_reported_bitrate_bps_ GUARDED_BY(network_state_lock_);
uint8_t last_reported_fraction_loss_ GUARDED_BY(network_state_lock_);
int64_t last_reported_rtt_ GUARDED_BY(network_state_lock_);
NetworkState network_state_ GUARDED_BY(network_state_lock_);
rtc::CriticalSection bwe_lock_;
int min_bitrate_bps_ GUARDED_BY(bwe_lock_);
std::unique_ptr<DelayBasedBwe> delay_based_bwe_ GUARDED_BY(bwe_lock_);
bool was_in_alr_;
rtc::RaceChecker worker_race_;
RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(SendSideCongestionController);
};
} // namespace webrtc
#endif // WEBRTC_MODULES_CONGESTION_CONTROLLER_INCLUDE_SEND_SIDE_CONGESTION_CONTROLLER_H_
|
Identification of knowledge translation opportunities in the treatment of locally advanced breast cancer: Results of a national survey of physicians. 6585 Background: Locally advanced breast cancer (LABC) accounts for only 10% of all breast cancers. While several guidelines and consensus statements exist, whether the current practice reflects these guidelines is unclear. We sought to survey the oncologists in Canada to assess current practice patterns and identify areas of targeted knowledge translation interventions (KTIs) in the treatment of LABC. Methods: 426 Canadian oncologists were surveyed with a 29 item survey-tool. They were subdivided into LABC experts (n=83) and non-experts (n=343). Physicians were removed from the survey if they identified that they were not involved in the treatment of breast cancer. The survey included demographic information as well as questions as to the current practice patterns utilized in the pathway of care for LABC patients. Level of discordance was calculated between the expert and non-expert responses using a z test. Results: 139 responses were obtained (48% response rate) from the non-experts and 51 responses we... |
// Declaration of functions:
// Definition of function updateAngles
void f_AngleReader_updateAngles(struct AngleReader_Instance *_instance, uint32_t inputSignal, uint8_t signalType, uint32_t L2delay, uint8_t calibrated) {
SignalType type = SignalType(signalType);
uint32_t signal = (type == SignalType::bh || type == SignalType::bv) ? inputSignal : inputSignal - L2delay;
double signalAngleMag = PI * ((double) signal)/PERIOD - PI/2;
double signalAngle = (type == SignalType::bh || type == SignalType::ch) ? -signalAngleMag : signalAngleMag;
if (type == SignalType::error) {
AngleReader_send_StatusSender_status(_instance, 15);
} else {
if (abs(signalAngle - lastSignalAngles[type]) < MAX_VALID_ANGLE_DIFFERENCE) {
signalAngles[type] = signalAngle;
}
lastSignalAngles[type] = signalAngle;
}
if (DEBUG && VERBOSE) {
if (calibrated) {
Serial.print("\nD,");
Serial.print(type);
Serial.print(",");
Serial.print(signalAngle,6);
Serial.print(",");
}
if (type == SignalType::bh) {
Serial.print("\nE,");
Serial.print(signalAngles[0],6);
Serial.print(",");
Serial.print(signalAngles[1],6);
Serial.print(",");
Serial.print(signalAngles[2],6);
Serial.print(",");
Serial.print(signalAngles[3],6);
Serial.print(",");
}
}
} |
/*
* Check the statistics for the requested parameter.
*/
static void
check_stats(struct VSM_data *vd, char *param)
{
int status;
struct stat_priv priv;
priv.found = 0;
priv.param = param;
#if defined(HAVE_VARNISHAPI_4) || defined(HAVE_VARNISHAPI_4_1)
(void)VSC_Iter(vd, NULL, check_stats_cb, &priv);
#elif defined(HAVE_VARNISHAPI_3)
(void)VSC_Iter(vd, check_stats_cb, &priv);
#endif
if (strcmp(param, "ratio") == 0) {
intmax_t total = priv.cache_hit + priv.cache_miss;
priv.value = total ? (100 * priv.cache_hit / total) : 0;
priv.info = "Cache hit ratio";
}
if (priv.found != 1) {
printf("Unknown parameter '%s'\n", param);
exit(1);
}
status = check_thresholds(priv.value);
printf("VARNISH %s: %s (%'jd)|%s=%jd\n", status_text[status],
priv.info, priv.value, param, priv.value);
exit(status);
} |
/*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
int init_divide_by_zero() {
int t[2][3][2] = {{{1, 1}, {2, 2}, {3, 3}}, {{4, 4}, {5, 5}, {1, 0}}};
return t[0][1][0] / t[1][2][1];
}
|
A hi-tech office block in Sweden is trialling a scheme in which its tenants have microchips inserted into their hands in order to gain entry to the building.
As reported by the BBC, the radio-frequency identification (RFID) chips are provided by the Swedish Biohacking Group, and enable tenants in the Epicenter office block to open doors and even operate the photocopiers. Future services are expected to include the ability to pay at the office cafe without needing to get out a card.
The chips are about the size of a grain of rice and are inserted by a tattooist between the thumb and index finger, with pain apparently no greater than that of an injection. Though it's claimed that this will eventually make things easier, the BBC reports it's not always the case, and their reporter actually had to twist his arm into a rather unnatural position to get the photocopier to work.
At the moment it's a pilot scheme with only a few receiving the chip, though it'll apparently be opened up to the building's 700 other employees on an opt-in basis.
As for why this is being done, the Swedish Biohacking Group suggests it's in order to prepare us for the day our governments and corporations "come to us and say everyone should get chipped - the tax authority chip, the Google or Facebook chip."
This isn't the first time we've seen something along these lines, with a US company previously having floated the idea of smart digital tattoos, though this is certainly far more invasive. What's more unsettling is the news the BBC reporter flew home with the chip still under his skin retaining all his contact details.
Is this something you could see the benefits of, or does the entire thing sound a bit too Big Brother for you? Sound off in the comments.
Luke Karmali is IGN's UK News Editor. You too can revel in mediocrity by following him on Twitter. |
def decode_domain_def(domains, merge=True, return_string=False):
    """Parse a domain definition string such as "10:120,130:250" into boundary pairs."""
    if not domains:
        return None, None
    if domains[-1] == ",":
        domains = domains[:-1]
    if return_string:
        domain_fragments = [[r.strip() for r in ro.split(":")] for ro in domains.split(",")]
    else:
        domain_fragments = [[int(r.strip()) for r in ro.split(":")] for ro in domains.split(",")]
    if merge:
        # Collapse all fragments into a single [start, end] boundary pair
        return [domain_fragments[0][0], domain_fragments[-1][-1]]
    else:
        return domain_fragments |
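A few illustrative calls to the parser above (inputs are made up; the results follow directly from the code):
decode_domain_def("")                                # -> (None, None)
decode_domain_def("10:120,130:250")                  # -> [10, 250]
decode_domain_def("10:120,130:250,", merge=False)    # -> [[10, 120], [130, 250]]
decode_domain_def("10:120", return_string=True)      # -> ['10', '120']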
A distributed adaptive scheme for multiagent systems In traditional adaptive control, the certainty equivalence principle suggests a two-step design scheme. A controller is first designed for the ideal situation, assuming the uncertain parameter were known, and it renders a Lyapunov function. Then, the uncertain parameter in the controller is replaced by its estimate, which is updated by an adaptive law along the gradient of the Lyapunov function. This principle does not generally work for a multiagent system, because an adaptive law based on the gradient of a (centrally constructed) Lyapunov function cannot be implemented in a distributed fashion, except in limited situations. In this paper, we propose a novel distributed adaptive scheme, not relying on the gradient of a Lyapunov function, for general multiagent systems. In this scheme, asymptotic consensus of a second-order uncertain multiagent system is achieved over a directed communication graph. |
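A toy illustration of the "no central Lyapunov gradient" idea: each agent updates its own coupling gain from purely local disagreement. This is a first-order, undirected-graph sketch with illustrative gains, not the paper's second-order directed-graph scheme.
import numpy as np

def simulate_adaptive_consensus(adjacency, x0, gamma=1.0, dt=0.01, steps=2000):
    """Single-integrator agents with node-local adaptive coupling gains."""
    A = np.asarray(adjacency, dtype=float)
    x = np.asarray(x0, dtype=float)
    c = np.ones_like(x)                       # adaptive coupling gains, one per agent
    for _ in range(steps):
        # local disagreement of each agent with its neighbours
        e = (A * (x[:, None] - x[None, :])).sum(axis=1)
        x = x + dt * (-c * e)                 # control uses only local information
        c = c + dt * gamma * e ** 2           # adaptive law, also purely local
    return x, c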
A Motion Retargeting Method with Footstep Constraints In the fields of animation and virtualization, many motion retargeting methods exist, but they are not widely applied in practical production. When a human motion is transferred to an avatar in practice, foot distortion is the most obvious artifact. In this paper we present a motion retargeting method for the 3D human body with footstep constraints. With foot end-effector constraints, we solve the footstep slip problem and keep the feet on the ground. To obtain more reasonable results, we also apply constrained smoothing. We present experimental results on real captured motion data. |
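A minimal sketch of the footstep-constraint idea: during detected contact frames the retargeted foot is pinned to a single ground-level anchor (eliminating foot slip), and in swing it is kept above the ground. The y-up axis convention, the contact detection input, and the absence of the paper's constrained smoothing are simplifying assumptions.
import numpy as np

def enforce_footstep_constraints(foot_positions, in_contact, ground_height=0.0):
    """foot_positions: (T, 3) retargeted foot trajectory; in_contact: (T,) booleans."""
    constrained = foot_positions.copy()
    anchor = None
    for t in range(len(constrained)):
        if in_contact[t]:
            if anchor is None:                     # first frame of this contact phase
                anchor = constrained[t].copy()
                anchor[1] = ground_height          # snap the anchor onto the ground
            constrained[t] = anchor                # no foot slip during contact
        else:
            anchor = None
            constrained[t, 1] = max(constrained[t, 1], ground_height)  # no ground penetration
    return constrained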
#!/usr/bin/env python
# Licensed to the StackStorm, Inc ('StackStorm') under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Visualize the links created by rules.
1. requires graphviz
pip install graphviz
apt-get install graphviz
To run :
./st2-analyze-links.py --action_ref <action-ref>
The command must run on a StackStorm box.
"""
import os
from oslo_config import cfg
from st2common import config
from st2common.util.monkey_patch import monkey_patch
from st2common.persistence.rule import Rule
from st2common.service_setup import db_setup
try:
from graphviz import Digraph
except ImportError:
msg = ('Missing "graphviz" dependency. You can install it using pip: \n'
'pip install graphviz')
raise ImportError(msg)
def do_register_cli_opts(opts, ignore_errors=False):
for opt in opts:
try:
cfg.CONF.register_cli_opt(opt)
        except Exception:
if not ignore_errors:
raise
class RuleLink(object):
def __init__(self, source_action_ref, rule_ref, dest_action_ref):
self._source_action_ref = source_action_ref
self._rule_ref = rule_ref
self._dest_action_ref = dest_action_ref
def __str__(self):
return '(%s -> %s -> %s)' % (self._source_action_ref, self._rule_ref, self._dest_action_ref)
class LinksAnalyzer(object):
def __init__(self):
self._rule_link_by_action_ref = {}
self._rules = {}
    def analyze(self, root_action_ref, link_trigger_ref):
        rules = Rule.query(trigger=link_trigger_ref, enabled=True)
# pprint.pprint([rule.ref for rule in rules])
for rule in rules:
source_action_ref = self._get_source_action_ref(rule)
if not source_action_ref:
                print('No source_action_ref for rule %s' % rule.ref)
continue
rule_links = self._rules.get(source_action_ref, None)
if rule_links is None:
rule_links = []
self._rules[source_action_ref] = rule_links
rule_links.append(RuleLink(source_action_ref=source_action_ref, rule_ref=rule.ref,
dest_action_ref=rule.action.ref))
analyzed = self._do_analyze(action_ref=root_action_ref)
for (depth, rule_link) in analyzed:
            print('%s%s' % (' ' * depth, rule_link))
return analyzed
def _get_source_action_ref(self, rule):
criteria = rule.criteria
source_action_ref = criteria.get('trigger.action_name', None)
if not source_action_ref:
source_action_ref = criteria.get('trigger.action_ref', None)
return source_action_ref['pattern'] if source_action_ref else None
def _do_analyze(self, action_ref, rule_links=None, processed=None, depth=0):
if processed is None:
            processed = set()
if rule_links is None:
rule_links = []
processed.add(action_ref)
for rule_link in self._rules.get(action_ref, []):
rule_links.append((depth, rule_link))
if rule_link._dest_action_ref in processed:
continue
self._do_analyze(rule_link._dest_action_ref, rule_links=rule_links,
processed=processed, depth=depth + 1)
return rule_links
class Grapher(object):
def generate_graph(self, rule_links, out_file):
graph_label = 'Rule based visualizer'
graph_attr = {
'rankdir': 'TD',
'labelloc': 't',
'fontsize': '15',
'label': graph_label
}
node_attr = {}
dot = Digraph(comment='Rule based links visualization',
node_attr=node_attr, graph_attr=graph_attr, format='png')
        nodes = set()
for _, rule_link in rule_links:
            print(rule_link._source_action_ref)
if rule_link._source_action_ref not in nodes:
nodes.add(rule_link._source_action_ref)
dot.node(rule_link._source_action_ref, rule_link._source_action_ref)
if rule_link._dest_action_ref not in nodes:
nodes.add(rule_link._dest_action_ref)
dot.node(rule_link._dest_action_ref, rule_link._dest_action_ref)
dot.edge(rule_link._source_action_ref, rule_link._dest_action_ref, constraint='true',
label=rule_link._rule_ref)
output_path = os.path.join(os.getcwd(), out_file)
dot.format = 'png'
dot.render(output_path)
def main():
monkey_patch()
cli_opts = [
cfg.StrOpt('action_ref', default=None,
help='Root action to begin analysis.'),
        cfg.StrOpt('link_trigger_ref', default='core.st2.generic.actiontrigger',
                   help='Trigger ref used to link actions via rules.'),
cfg.StrOpt('out_file', default='pipeline')
]
do_register_cli_opts(cli_opts)
config.parse_args()
db_setup()
rule_links = LinksAnalyzer().analyze(cfg.CONF.action_ref, cfg.CONF.link_trigger_ref)
Grapher().generate_graph(rule_links, cfg.CONF.out_file)
if __name__ == '__main__':
main()
|
Jordan, a high school junior, was feeling the pressure to decide what career he wanted to pursue and where to apply to college. The avid athlete had a vague idea that he wanted to go into a medical field.
He went on Facebook and used the application Match Me 3. Based on his completion of an interest assessment, the app suggested that he consider medical-related fields that aligned with his interest in athletics, such as physical therapy.
After exploring various matching careers, he was able to compare schools with relevant programs of study and determine which schools might be right to pursue what interested him.
Mobile apps help job seekers stay on top of their job search, even on the run. Tolan highlights apps such as LinkedIn, for networking and staying in contact with co-workers and friends; Pocket Resume, which allows users to update and tailor their résumé on the go; and productivity apps including Evernote, to help job seekers keep track of their next steps.
Tolan’s own Spark Hire app allows job seekers to record one-way video interviews answering an employer’s written questions by video, letting candidates move the interview process forward virtually.
Another website worth considering is the job-matching site TheLadders, which initially was limited to individuals and jobs with salaries above $100,000. It’s now open to job seekers at all levels. All posted jobs are screened, and only full-time, salary-based positions are listed. Job seekers can narrow search results to meet their specific experience and desired criteria.
George Bradt, creator of the New Leader Smart Tools iPad app — which helps managers starting a new job or initiative put together a 100-day action plan to build their team and quickly start showing results — reminds us that apps are just tools. They merely assist you in doing what you need to do. Apps by themselves won’t make anyone successful. It is how you use the apps that matter. They “just make it easier for them to know about, plan, implement and follow through on things that will make them and their teams successful,” he says.
Another helpful app is Clipix, an online bookmarking tool that allows job hunters to create a private and personalized job search center. Clipix users can save, categorize and short-list the job opportunities from any website they want and then access them anywhere from a computer, iPhone or Android.
In addition to bookmarking links, the user can upload images, videos and documents to create a place to retrieve job search materials, no matter where you are.
BoardProspects is a social network designed to help individuals connect with advisory boards and board of directors for nonprofit groups as well as private and public companies. While most professionals think that board seats are only for people in the latter stages of their career or those with lots of money, there are many opportunities available for talented individuals to join boards.
Backed by NASDAQ, BoardProspects is aimed at building better boards and boardroom recruitment through social and professional online networks. It's essentially a LinkedIn for boardrooms.
CanWeNetwork is an interesting new app that will be launched next week. It is a geospatial business networking app. It allows members to locate people in the network, based on criteria including location, interests, past work experience and groups, and provides a means to connect with these individuals.
For example, say you were checking in for a flight to London. You could log on to the network and find out who on your plane you might like to meet.
The internet and mobile apps are essential career tools today. Whether it be gaining new skills through online learning, using the internet to aid your job search, building an online business network or using online tools to be more productive at work, there's an app for that.
Being familiar with the online tools that are available and learning how to use them will not only make you more effective as an employee, but also more desirable as a job candidate.
A veteran human resources executive, Lee E. Miller is a career coach and the author of "UP: Influence Power and the U Perspective — The Art of Getting What You Want." Mail questions to Lee@employability-expert.com. |
Comparative Study of Calculation Methods for Short-Circuit Currents in Low-Voltage Networks with Asynchronous Motors The most dangerous short-time emergency operating mode of electrical installations in the auxiliary systems of thermal power plants with asynchronous motors is the short-circuit mode. One reason for the failure of protective devices to operate selectively and in time when a short circuit occurs in the auxiliary systems of power plants and substations is an effect that is not captured by the calculation method of the national standard GOST 28249-93, "Method for calculating short-circuit currents". According to the existing GOST 28249-93 methodology, asynchronous motors should be considered only at the initial stage of the short circuit, when they switch to generator mode and increase the short-circuit current. One of its assumptions is that the effect of asynchronous motors on the short-circuit current is taken into account only if the rated motor current exceeds 1.0 % of the initial value of the periodic component of the short-circuit current in auxiliary systems of power plants and substations with low-voltage asynchronous motors. The purpose of the research in this paper is an improved methodology for accounting for the effect of asynchronous motors on the short-circuit current within the current GOST 28249-93 methodology, and a comparison of this calculation with the international standard IEC 60909 and with the ETAP (Electrical Transient Analyzer Program) software package, in order to assess their capability in solving such problems, the differences in their results, the errors allowed in short-circuit current calculations, and their scope and convenience of application. The research method is the creation of a dedicated calculation and mathematical model based on the national standard GOST 28249-93, the international standard IEC 60909, and the ETAP software package, using as an example the existing section VIII system of CHPP-1 in Dushanbe, Republic of Tajikistan, with nine asynchronous motors. |
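For scale, a simplified arithmetic sketch of a low-voltage asynchronous motor's contribution to the initial symmetrical short-circuit current in the IEC 60909 framework; the ratings below are illustrative, and the GOST 28249-93 and ETAP comparisons from the paper are not reproduced.
from math import sqrt

U_n = 400.0          # nominal network voltage, V
S_rM = 200e3         # motor rated apparent power, VA
U_rM = 400.0         # motor rated voltage, V
lr_ratio = 5.0       # locked-rotor to rated current ratio I_LR / I_rM
c = 1.05             # voltage factor for maximum LV short-circuit currents

I_rM = S_rM / (sqrt(3) * U_rM)              # rated current, about 289 A
Z_M = (1.0 / lr_ratio) * U_rM**2 / S_rM     # motor short-circuit impedance, about 0.16 ohm
I_k_motor = c * U_n / (sqrt(3) * Z_M)       # contribution, about 1.5 kA (~5 x I_rM)

print(f"I_rM = {I_rM:.0f} A, Z_M = {Z_M:.3f} ohm, I''_kM = {I_k_motor:.0f} A")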
// Command protoc-gen-doc is a Protobuf plugin for documentation generation.
//
// Documentation can be found inside:
//
// README.doc.md (https://github.com/sourcegraph/prototools/blob/master/README.doc.md)
//
// More information about Protobuf can be found at:
//
// https://developers.google.com/protocol-buffers/
//
package main // import "sourcegraph.com/sourcegraph/prototools/cmd/protoc-gen-doc"
import (
"bytes"
"encoding/xml"
"fmt"
"go/build"
"io"
"io/ioutil"
"log"
"os"
"path/filepath"
"github.com/golang/protobuf/proto"
plugin "github.com/golang/protobuf/protoc-gen-go/plugin"
"sourcegraph.com/sourcegraph/prototools/tmpl"
"sourcegraph.com/sourcegraph/prototools/util"
)
// PathDir returns the absolute path to a file given a relative one in one of
// the $GOPATH directories. If it cannot be resolved, relPath itself is
// returned.
func PathDir(relPath string) string {
	// Test against each directory listed in $GOPATH
for _, path := range filepath.SplitList(build.Default.GOPATH) {
path = filepath.Join(path, relPath)
if _, err := os.Stat(path); err == nil {
return path
}
}
return relPath
}
// extendParams extends the given parameter map with the second one.
func extendParams(params, second map[string]string) map[string]string {
for k, v := range second {
if _, ok := params[k]; !ok {
params[k] = v
}
}
return params
}
var basicFileMap = `
<FileMap>
{{$templatePath := "%s"}}
{{range .ProtoFile}}
<Generate>
<Template>{{$templatePath}}</Template>
<Target>{{.Name}}</Target>
<Output>{{trimExt .Name}}{{ext $templatePath}}</Output>
</Generate>
{{end}}
</FileMap>
`
func main() {
// Configure logging.
log.SetFlags(0)
log.SetPrefix("protoc-gen-doc: ")
// Create a template generator.
g := tmpl.New()
// Read input from the protoc compiler.
data, err := ioutil.ReadAll(os.Stdin)
if err != nil {
log.Fatal(err, ": failed to read input")
}
// Unmarshal the protoc generation request.
request := &plugin.CodeGeneratorRequest{}
if err := proto.Unmarshal(data, request); err != nil {
log.Fatal(err, ": failed to parse input proto")
}
if err := g.SetRequest(request); err != nil {
log.Fatal(err, ": failed to set request")
}
if len(request.FileToGenerate) == 0 {
log.Fatal(err, ": no input files")
}
// Verify the command-line parameters.
params := util.ParseParams(request)
// Handle configuration files.
if conf, ok := params["conf"]; ok {
confData, err := ioutil.ReadFile(conf)
if err != nil {
log.Fatal(err, ": could not read conf file")
}
request.Parameter = proto.String(string(confData))
params = extendParams(params, util.ParseParams(request))
}
paramTemplate, haveTemplate := params["template"]
paramFileMap, haveFileMap := params["filemap"]
if haveTemplate && haveFileMap {
log.Fatal("expected either template or filemap argument, not both")
}
// Build the filemap based on the command-line parameters.
var fileMapDir, fileMapData string
if haveTemplate {
// Use the specified template file once on each input proto file.
fileMapData = fmt.Sprintf(basicFileMap, paramTemplate)
} else if haveFileMap {
// Load the filemap template.
data, err := ioutil.ReadFile(paramFileMap)
if err != nil {
log.Fatal(err, ": failed to read file map")
}
fileMapData = string(data)
fileMapDir = filepath.Dir(paramFileMap)
} else {
// Use the default filemap template once on each input proto file.
def := PathDir("src/sourcegraph.com/sourcegraph/prototools/templates/tmpl.html")
fileMapData = fmt.Sprintf(basicFileMap, def)
fileMapDir = filepath.Dir(def)
}
// Parse the file map template.
if err = g.ParseFileMap(fileMapDir, fileMapData); err != nil {
log.Fatal(err, ": failed to parse file map")
}
	// Dump the executed filemap template, if desired.
if v, ok := params["dump-filemap"]; ok {
f, err := os.Create(v)
if err != nil {
log.Fatal(err, ": failed to crate dump file")
}
dump, err := xml.MarshalIndent(g.FileMap, "", " ")
if err != nil {
log.Fatal(err, ": failed to marshal filemap")
}
_, err = io.Copy(f, bytes.NewReader(dump))
if err != nil {
log.Fatal(err, ": failed to write dump file")
}
}
// Determine the root directory.
if v, ok := params["root"]; ok {
g.RootDir = v
} else {
g.RootDir, err = os.Getwd()
if err != nil {
log.Fatal(err)
}
}
// Map the API host, if any.
if v, ok := params["apihost"]; ok {
g.APIHost = v
}
// Perform generation.
response, err := g.Generate()
if err != nil {
log.Fatal(err, ": failed to generate")
}
// Marshal the results and write back to the protoc compiler.
data, err = proto.Marshal(response)
if err != nil {
log.Fatal(err, ": failed to marshal output proto")
}
_, err = io.Copy(os.Stdout, bytes.NewReader(data))
if err != nil {
log.Fatal(err, ": failed to write output proto")
}
}
|
LAS CRUCES - The Doña Ana County Sheriff's Office bomb squad destroyed more than 100 pounds of illegal fireworks Wednesday.
The fireworks had been seized over the Fourth of July holiday by city and county officials, and were incinerated in a controlled explosion before 10 a.m. at an area near the Southern New Mexico State Fairgrounds, west of Las Cruces.
The haul of illegal fireworks that was destroyed included a batch seized by the Las Cruces Fire Department that had an estimated value of about $5,000, sheriff's spokeswoman Kelly Jameson said.
"The Doña Ana County bomb squad is tasked each year with destroying these illegal fireworks. We work closely with the Las Cruces Fire Department to obtain a permit from the (Bureau of Alcohol, Tobacco, Firearms and Explosives)," Jameson said.
Before the explosion, the bomb squad placed the fireworks inside a deep pit, and then doused a flammable liquid over the mix of devices that included aerial spinners, helicopters, mines, missile-type rockets, Roman candles, and other fireworks.
The explosion itself produced large wisps of smoke and some flames. The fireworks were destroyed in fewer than 30 seconds.
"We pull this together in the middle of the day because this isn't exactly designed to be a show. These things are unpredictable," Jameson said. "The bomb guys will tell you this is one of the dangerous tasks that they perform each year."
She added: "We're happy to give the public an idea of what this looks like by inviting the media out today." It was the first time members of the media had been invited to the fireworks destruction.
The fireworks that were destroyed "represents just a small portion of illegal fireworks that were successfully burned in the county," according to Jameson.
Weeks after the Fourth of July, illegal fireworks remain a nuisance in the county, Jameson said. Most recently, deputies responded to an incident on Tuesday night.
"Somebody north of town ... was lighting off a Roman candle in county limits and accidentally set their car on fire," she said. |
A Novel Chewing Detection System Based on PPG, Audio, and Accelerometry In the context of dietary management, accurate monitoring of eating habits is receiving increased attention. Wearable sensors, combined with the connectivity and processing of modern smartphones, can be used to robustly extract objective and real-time measurements of human behavior. In particular, for the task of chewing detection, several approaches based on an in-ear microphone can be found in the literature, while other types of sensors have also been reported, such as strain sensors. In this work, carried out in the context of the SPLENDID project, we propose to combine an in-ear microphone with a photoplethysmography (PPG) sensor placed in the ear concha in a new high-accuracy, low-sampling-rate prototype chewing detection system. We propose a pipeline that initially processes each sensor signal separately, and then fuses both to perform the final detection. Features are extracted from each modality, and support vector machine (SVM) classifiers are used separately to perform snacking detection. Finally, we combine the SVM scores from both signals in a late-fusion scheme, which leads to increased eating detection accuracy. We evaluate the proposed eating monitoring system on a challenging, semi-free-living dataset of 14 subjects, which includes more than 60 h of audio and PPG signal recordings. Results show that fusing the audio and PPG signals significantly improves the effectiveness of eating event detection, achieving accuracy up to 0.938 and class-weighted accuracy up to 0.892. |
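A minimal sketch of the late-fusion step described above: two SVMs are trained separately on audio and PPG feature windows and their decision scores are combined with a weighted sum. Feature extraction, window alignment, and the fusion weight are assumptions; the paper's exact pipeline and evaluation are not reproduced.
import numpy as np
from sklearn.svm import SVC

def train_late_fusion(X_audio, X_ppg, y, fusion_weight=0.5):
    """X_audio, X_ppg: per-window feature matrices; y: chewing labels (0/1)."""
    svm_audio = SVC(kernel="rbf").fit(X_audio, y)
    svm_ppg = SVC(kernel="rbf").fit(X_ppg, y)

    def predict(Xa, Xp):
        # Late fusion: weighted sum of the two decision scores
        score = (fusion_weight * svm_audio.decision_function(Xa)
                 + (1.0 - fusion_weight) * svm_ppg.decision_function(Xp))
        return (score > 0).astype(int)          # 1 = chewing / eating detected
    return predict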
Roux-en-Y gastric bypass in the treatment of non-classic congenital adrenal hyperplasia due to 11-hydroxylase deficiency Non-classic adrenal hyperplasia (NCAH) has been associated with insulin resistance (IR). Therapies such as metformin, thiazolidinediones and lifestyle alterations improve IR and also ameliorate the biochemical and clinical abnormalities of NCAH, much as they do in polycystic ovarian syndrome (PCOS). More recently, bariatric surgery, such as Roux-en-Y gastric bypass (RYGBP), has also been associated with improvement in IR and amelioration of PCOS and may, therefore, be beneficial in NCAH. We report a case of a 39-year-old, deaf-mute, obese woman with NCAH due to 11-hydroxylase deficiency who underwent RYGBP followed by improvement of NCAH manifestations. She was initially treated with metformin and pioglitazone, which lowered serum 11-deoxycortisol from 198ng/dl (<51) to 26ng/dl. Five weeks after undergoing RYGBP her body mass index fell from 44.18kg/m2 to 39.54kg/m2 and, despite not taking metformin or pioglitazone, serum 11-deoxycortisol remained normal at <40ng/dl. Concurrently and subsequently, her NCAH symptoms, for example, alopecia, hirsutism and irregular menses normalised as well. We conclude that RYGBP, like other interventions that reduce IR, may be another way of treating non-classic 11-hydroxylase deficiency in selected patients. |
/**
* Definition for a binary tree node.
* public class TreeNode {
* int val;
* TreeNode left;
* TreeNode right;
* TreeNode(int x) { val = x; }
* }
*/
class Solution {
List<TreeNode> path = new ArrayList<>();
public TreeNode lowestCommonAncestor(TreeNode root, TreeNode p, TreeNode q) {
if (root == null || root == p || root == q) return root;
TreeNode left = lowestCommonAncestor(root.left, p, q);
TreeNode right = lowestCommonAncestor(root.right, p, q);
if (left != null && right != null) {
return root;
}
return left == null ? right : left;
}
    public TreeNode lowestCommonAncestor2(TreeNode root, TreeNode p, TreeNode q) {
        if (root == null) {
            return null;
        }
        // Collect node-to-root paths for both targets, then walk them back from
        // the root until they diverge; the last shared node is the LCA.
        path.clear();
        findPath(root, p);
        List<TreeNode> pPath = new ArrayList<>(path);
        path.clear();
        findPath(root, q);
        List<TreeNode> qPath = new ArrayList<>(path);
int n = pPath.size() - 1;
int m = qPath.size() - 1;
TreeNode first = null;
while (n >= 0 && m >= 0) {
if (qPath.get(m) != pPath.get(n)) {
break;
}
first = pPath.get(n);
n--;
m--;
}
return first;
}
    public boolean findPath(TreeNode root, TreeNode n) {
        if (root == null) {
            return false;
        }
        // Note: nodes are matched by value, so this assumes values are unique in the tree.
        if (root.val == n.val) {
            path.add(root);
            return true;
} else {
if (findPath(root.left, n) || findPath(root.right, n)) {
path.add(root);
return true;
} else {
return false;
}
}
}
} |
def _update_hydro_output(m, year, scenario):
    """Write the historic hydro trace for (year, scenario) into the model parameter,
    zeroing out negligible values (below 0.01) to suppress numerical noise.
    Relies on `self.input_traces` being available from the enclosing scope."""
    for g in m.G_E_HYDRO:
        for t in m.T:
            output = self.input_traces.loc[(year, scenario), ('HYDRO', g, t)]
            if output < 0.01:
                output = 0
            m.P_HYDRO_HISTORIC[g, t] = output |
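Illustrative context for the fragment above, assuming `m` is a Pyomo model whose P_HYDRO_HISTORIC parameter was declared mutable and `self.input_traces` is a pandas DataFrame indexed by (year, scenario) with ('HYDRO', unit, t) columns; the names and set contents below are made up.
from pyomo.environ import ConcreteModel, Param, Set

m = ConcreteModel()
m.G_E_HYDRO = Set(initialize=["HYDRO1", "HYDRO2"])   # hydro generators (assumed)
m.T = Set(initialize=range(1, 49))                   # trading intervals (assumed)
m.P_HYDRO_HISTORIC = Param(m.G_E_HYDRO, m.T, initialize=0.0, mutable=True)
# The updater above can then overwrite these parameter values in place for a
# given (year, scenario) before the model is solved.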
Integrated tracking, classification, and sensor management All engineers and researchers interested in tracking and sensor fusion should scrutinize this new book consisting of 17 chapters in 712 pages written by the world's experts on the subject. The book is a pleasure to read, and it is a cornucopia of practical algorithms and new theory with quantitative performance comparisons. For example, chapter 8 by Neil Gordon, et al. surveys the state-of-the-art in track-before-detect algorithms, including a thorough quantitative comparison of 5 classes of algorithms (Viterbi, Baum-Welch, particle filters, maximum likelihood PDA and histogram probabilistic MHT), showing ROC curves, one-sigma errors and computational complexity for various signal-to-noise ratios and scenarios and sensors; the algorithms are thoroughly described in clear accessible prose; an extensive list of references is included. Most chapters in this book are written at a similar high level of quality and clarity and thoroughness. |
// File: UXToolsGame/Source/UXToolsTests/Tests/ManipulatorConstraint.spec.cpp
// Copyright (c) 2020 Microsoft Corporation.
// Licensed under the MIT License.
#include "Engine.h"
#include "FrameQueue.h"
#include "GenericManipulatorTestComponent.h"
#include "UxtTestHand.h"
#include "UxtTestHandTracker.h"
#include "UxtTestUtils.h"
#include "Components/SceneComponent.h"
#include "Input/UxtFarPointerComponent.h"
#include "Input/UxtNearPointerComponent.h"
#include "Interactions/Constraints/UxtFaceUserConstraint.h"
#include "Interactions/Constraints/UxtFixedDistanceConstraint.h"
#include "Interactions/Constraints/UxtFixedRotationToUserConstraint.h"
#include "Interactions/Constraints/UxtFixedRotationToWorldConstraint.h"
#include "Interactions/Constraints/UxtMaintainApparentSizeConstraint.h"
#include "Interactions/Constraints/UxtMoveAxisConstraint.h"
#include "Interactions/Constraints/UxtRotationAxisConstraint.h"
#include "Interactions/UxtGenericManipulatorComponent.h"
#include "Tests/AutomationCommon.h"
#include "Utils/UxtFunctionLibrary.h"
#if WITH_DEV_AUTOMATION_TESTS
namespace
{
const FVector TargetLocation(150, 0, 0);
UUxtGenericManipulatorComponent* CreateTestComponent()
{
UWorld* World = UxtTestUtils::GetTestWorld();
AActor* Actor = World->SpawnActor<AActor>();
// Box Mesh
UStaticMeshComponent* Mesh = UxtTestUtils::CreateBoxStaticMesh(Actor);
Actor->SetRootComponent(Mesh);
Mesh->RegisterComponent();
// Generic manipulator component
UUxtGenericManipulatorComponent* Manipulator = NewObject<UUxtGenericManipulatorComponent>(Actor);
Manipulator->OneHandRotationMode = EUxtOneHandRotationMode::RotateAboutObjectCenter;
Manipulator->SetSmoothing(0.0f);
Manipulator->RegisterComponent();
Actor->SetActorLocation(TargetLocation);
return Manipulator;
}
FRotator GetCameraRotation(UWorld* World)
{
APlayerCameraManager* CameraManager = UGameplayStatics::GetPlayerCameraManager(World, 0);
return CameraManager->GetCameraRotation();
}
} // namespace
BEGIN_DEFINE_SPEC(
ManipulatorConstraintSpec, "UXTools.GenericManipulator.Constraints",
EAutomationTestFlags::ProductFilter | EAutomationTestFlags::ApplicationContextMask)
void EnqueueFixedRotationToWorldConstraintTests();
void EnqueueFixedRotationToUserConstraintTests();
void EnqueueFaceUserConstraintTests();
void EnqueueRotationAxisConstraintTests();
void EnqueueFixedDistanceConstraintTests();
void EnqueueMaintainApparentSizeConstraintTests();
void EnqueueMoveAxisConstraintTests();
UUxtGenericManipulatorComponent* Target;
UUxtTransformConstraint* Constraint;
FFrameQueue FrameQueue;
// Must be configured by Describe block if needed
EUxtInteractionMode InteractionMode;
FUxtTestHand LeftHand = FUxtTestHand(EControllerHand::Left);
FUxtTestHand RightHand = FUxtTestHand(EControllerHand::Right);
// Cache for a position to use between frames
FVector PositionCache;
END_DEFINE_SPEC(ManipulatorConstraintSpec)
void ManipulatorConstraintSpec::Define()
{
BeforeEach([this] {
TestTrueExpr(AutomationOpenMap(TEXT("/Game/UXToolsGame/Tests/Maps/TestEmpty")));
UWorld* World = UxtTestUtils::GetTestWorld();
FrameQueue.Init(&World->GetGameInstance()->GetTimerManager());
UxtTestUtils::EnableTestHandTracker();
Target = CreateTestComponent();
});
AfterEach([this] {
Target->GetOwner()->Destroy();
Target = nullptr;
Constraint = nullptr;
UxtTestUtils::DisableTestHandTracker();
FrameQueue.Reset();
});
Describe("Constraint Selection", [this] {
BeforeEach([this] {
InteractionMode = EUxtInteractionMode::Near;
RightHand.Configure(InteractionMode, TargetLocation);
UUxtMoveAxisConstraint* MoveConstraint = NewObject<UUxtMoveAxisConstraint>(Target->GetOwner());
MoveConstraint->ConstraintOnMovement = static_cast<uint32>(EUxtAxisFlags::X | EUxtAxisFlags::Y | EUxtAxisFlags::Z);
MoveConstraint->RegisterComponent();
UUxtRotationAxisConstraint* RotationConstraint = NewObject<UUxtRotationAxisConstraint>(Target->GetOwner());
RotationConstraint->AllowedAxis = EUxtAxis::None;
RotationConstraint->RegisterComponent();
Constraint = MoveConstraint;
});
AfterEach([this] { RightHand.Reset(); });
LatentIt("should automatically detect all constraints", [this](const FDoneDelegate& Done) {
Target->SetAutoDetectConstraints(true);
const FTransform InitialTransform = Target->GetOwner()->GetTransform();
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Translate(FVector(10, 10, 10));
RightHand.Rotate(FQuat(FVector::OneVector, FMath::DegreesToRadians(90)));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, InitialTransform] {
const FTransform Result = Target->GetOwner()->GetTransform();
TestEqual("The movement constraint was applied", Result.GetLocation(), InitialTransform.GetLocation());
TestEqual("The rotation constraint was applied", Result.GetRotation(), InitialTransform.GetRotation());
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("should only use selected constraints", [this](const FDoneDelegate& Done) {
FComponentReference ConstraintReference;
ConstraintReference.OverrideComponent = Constraint;
Target->SetAutoDetectConstraints(false);
Target->AddConstraint(ConstraintReference);
const FTransform InitialTransform = Target->GetOwner()->GetTransform();
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Translate(FVector(10, 10, 10));
RightHand.Rotate(FQuat(FVector::OneVector, FMath::DegreesToRadians(90)));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, InitialTransform] {
const FTransform Result = Target->GetOwner()->GetTransform();
TestEqual("The movement constraint was applied", Result.GetLocation(), InitialTransform.GetLocation());
TestNotEqual("The rotation constraint was not applied", Result.GetRotation(), InitialTransform.GetRotation());
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
Describe("Near Interaction", [this] {
BeforeEach([this] {
InteractionMode = EUxtInteractionMode::Near;
LeftHand.Configure(InteractionMode, TargetLocation);
RightHand.Configure(InteractionMode, TargetLocation);
});
AfterEach([this] {
LeftHand.Reset();
RightHand.Reset();
});
EnqueueFixedRotationToWorldConstraintTests();
EnqueueFixedRotationToUserConstraintTests();
EnqueueFaceUserConstraintTests();
EnqueueRotationAxisConstraintTests();
EnqueueFixedDistanceConstraintTests();
EnqueueMaintainApparentSizeConstraintTests();
EnqueueMoveAxisConstraintTests();
});
Describe("Far Interaction", [this] {
BeforeEach([this] {
InteractionMode = EUxtInteractionMode::Far;
LeftHand.Configure(InteractionMode, TargetLocation);
RightHand.Configure(InteractionMode, TargetLocation);
});
AfterEach([this] {
LeftHand.Reset();
RightHand.Reset();
});
EnqueueFixedRotationToWorldConstraintTests();
EnqueueFixedRotationToUserConstraintTests();
EnqueueFaceUserConstraintTests();
EnqueueRotationAxisConstraintTests();
EnqueueFixedDistanceConstraintTests();
EnqueueMaintainApparentSizeConstraintTests();
EnqueueMoveAxisConstraintTests();
});
}
void ManipulatorConstraintSpec::EnqueueFixedRotationToWorldConstraintTests()
{
Describe("UxtFixedRotationToWorldConstraint", [this] {
BeforeEach([this] {
Constraint = NewObject<UUxtFixedRotationToWorldConstraint>(Target->GetOwner());
Constraint->RegisterComponent();
});
LatentIt("Should maintain fixed rotation to world with one hand", [this](const FDoneDelegate& Done) {
const FTransform ExpectedTransform = Target->GetOwner()->GetTransform();
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Rotate(FQuat(FVector::ForwardVector, FMath::DegreesToRadians(90)));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedTransform] {
const FTransform Result = Target->GetOwner()->GetTransform();
TestTrue("Objects rotation didn't change with hand rotation", Result.Equals(ExpectedTransform));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("Should maintain fixed rotation to world with two hands", [this](const FDoneDelegate& Done) {
const FTransform ExpectedTransform = Target->GetOwner()->GetTransform();
FrameQueue.Enqueue([this] {
LeftHand.Translate(FVector(0, -50, 0));
RightHand.Translate(FVector(0, 50, 0));
LeftHand.SetGrabbing(true);
RightHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() == 2);
LeftHand.Translate(FVector(0, 50, -50));
RightHand.Translate(FVector(0, -50, 50));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedTransform] {
const FTransform Result = Target->GetOwner()->GetTransform();
TestTrue("Objects rotation didn't change with hands rotation", Result.Equals(ExpectedTransform));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
}
void ManipulatorConstraintSpec::EnqueueFixedRotationToUserConstraintTests()
{
Describe("UxtFixedRotationToUserConstraint", [this] {
BeforeEach([this] {
UUxtFixedRotationToUserConstraint* FixedRotationToUserConstraint =
NewObject<UUxtFixedRotationToUserConstraint>(Target->GetOwner());
FixedRotationToUserConstraint->bExcludeRoll = false;
Constraint = FixedRotationToUserConstraint;
Constraint->RegisterComponent();
});
LatentIt("Should maintain fixed rotation to user with one hand", [this](const FDoneDelegate& Done) {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
const FTransform ExpectedTransformAfterHandRotation = TransformTarget;
// store relative rotation to camera
const FRotator CameraRotation = GetCameraRotation(Constraint->GetWorld());
const FQuat RelativeRotationToCameraStart = CameraRotation.Quaternion().Inverse() * TransformTarget.GetRotation();
const FQuat HeadTilt = FQuat(FVector::ForwardVector, FMath::DegreesToRadians(90));
USceneComponent* CameraController = UxtTestUtils::CreateTestCamera(Constraint->GetWorld());
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this, HeadTilt] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Rotate(HeadTilt);
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedTransformAfterHandRotation, CameraController, HeadTilt] {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
TestTrue(
"Objects rotation didn't change with hand rotation",
TransformTarget.GetRotation().Rotator().Equals(ExpectedTransformAfterHandRotation.Rotator()));
// tilt head
CameraController->SetRelativeRotation(HeadTilt);
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, RelativeRotationToCameraStart, HeadTilt, CameraController] {
// check if our object rotated and if the rotation relative to head / camera is still the same
const FTransform Result = Target->TransformTarget->GetComponentTransform();
TestTrue("Objects rotation changed with head rotation", HeadTilt.Rotator().Equals(Result.Rotator()));
const FRotator CameraRotation = GetCameraRotation(Constraint->GetWorld());
const FQuat RelativeRotationToCamera = CameraRotation.Quaternion().Inverse() * Result.GetRotation();
TestTrue(
"Objects rotation relative to camera / head stayed the same",
RelativeRotationToCameraStart.Rotator().Equals(RelativeRotationToCamera.Rotator()));
CameraController->GetOwner()->Destroy();
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("Should maintain fixed rotation to user with two hands", [this](const FDoneDelegate& Done) {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
const FTransform ExpectedTransformAfterHandRotation = TransformTarget;
// store relative rotation to camera
const FRotator CameraRotation = GetCameraRotation(Constraint->GetWorld());
const FQuat RelativeRotationToCameraStart = CameraRotation.Quaternion().Inverse() * TransformTarget.GetRotation();
const FQuat HeadTilt = FQuat(FVector::ForwardVector, FMath::DegreesToRadians(90));
USceneComponent* CameraController = UxtTestUtils::CreateTestCamera(Constraint->GetWorld());
FrameQueue.Enqueue([this] {
LeftHand.Translate(FVector(0, -50, 0));
RightHand.Translate(FVector(0, 50, 0));
LeftHand.SetGrabbing(true);
RightHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() == 2);
LeftHand.Translate(FVector(0, 50, -50));
RightHand.Translate(FVector(0, -50, 50));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedTransformAfterHandRotation, CameraController, HeadTilt] {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
TestTrue(
"Objects rotation didn't change with hand rotation",
TransformTarget.GetRotation().Rotator().Equals(ExpectedTransformAfterHandRotation.Rotator()));
// tilt head
CameraController->SetRelativeRotation(HeadTilt);
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, RelativeRotationToCameraStart, HeadTilt, CameraController] {
// check if our object rotated and if the rotation relative to head / camera is still the same
const FTransform Result = Target->TransformTarget->GetComponentTransform();
TestTrue("Objects rotation changed with head rotation", HeadTilt.Rotator().Equals(Result.Rotator()));
const FRotator CameraRotation = GetCameraRotation(Constraint->GetWorld());
const FQuat RelativeRotationToCamera = CameraRotation.Quaternion().Inverse() * Result.GetRotation();
TestTrue(
"Objects rotation relative to camera / head stayed the same",
RelativeRotationToCameraStart.Rotator().Equals(RelativeRotationToCamera.Rotator()));
CameraController->GetOwner()->Destroy();
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
}
void ManipulatorConstraintSpec::EnqueueFaceUserConstraintTests()
{
Describe("UxtFaceUserConstraint", [this] {
BeforeEach([this] {
Constraint = NewObject<UUxtFaceUserConstraint>(Target->GetOwner());
Constraint->RegisterComponent();
});
LatentIt("Should face user with one hand", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Rotate(FQuat(FVector::ForwardVector, FMath::DegreesToRadians(90)));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this] {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
FVector DirectionToTarget =
TransformTarget.GetLocation() - UUxtFunctionLibrary::GetHeadPose(Constraint->GetWorld()).GetLocation();
FQuat OrientationFacingTarget = FRotationMatrix::MakeFromXZ(-DirectionToTarget, FVector::UpVector).ToQuat();
TestTrue("Object is facing user", TransformTarget.GetRotation().Rotator().Equals(OrientationFacingTarget.Rotator()));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("Should face away from user with two hands", [this](const FDoneDelegate& Done) {
UUxtFaceUserConstraint* FaceUserConstraint = Cast<UUxtFaceUserConstraint>(Constraint);
FaceUserConstraint->bFaceAway = true;
FrameQueue.Enqueue([this] {
LeftHand.Translate(FVector(0, -50, 0));
RightHand.Translate(FVector(0, 50, 0));
LeftHand.SetGrabbing(true);
RightHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() == 2);
LeftHand.Translate(FVector(50, 50, -50));
RightHand.Translate(FVector(-50, -50, 50));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this] {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
FVector DirectionToTarget =
TransformTarget.GetLocation() - UUxtFunctionLibrary::GetHeadPose(Constraint->GetWorld()).GetLocation();
FQuat OrientationFacingTarget = FRotationMatrix::MakeFromXZ(DirectionToTarget, FVector::UpVector).ToQuat();
TestTrue(
"Object is facing away from user", TransformTarget.GetRotation().Rotator().Equals(OrientationFacingTarget.Rotator()));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
}
void ManipulatorConstraintSpec::EnqueueRotationAxisConstraintTests()
{
Describe("UxtRotationAxisConstraint", [this] {
BeforeEach([this] {
UUxtRotationAxisConstraint* RotationAxisConstraint = NewObject<UUxtRotationAxisConstraint>(Target->GetOwner());
RotationAxisConstraint->AllowedAxis = EUxtAxis::Z;
Constraint = RotationAxisConstraint;
Constraint->RegisterComponent();
});
LatentIt("Should restrict rotation in X axis with one hand", [this](const FDoneDelegate& Done) {
const FRotator ExpectedRotation = Target->TransformTarget->GetComponentTransform().Rotator();
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Rotate(FQuat(FVector::ForwardVector, FMath::DegreesToRadians(90)));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedRotation] {
const FTransform& TransformTarget = Target->TransformTarget->GetComponentTransform();
TestTrue("Objects rotation didn't change with hand rotation", TransformTarget.Rotator().Equals(ExpectedRotation));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("Should restrict rotation in X axis with two hands", [this](const FDoneDelegate& Done) {
const FRotator ExpectedRotation = Target->TransformTarget->GetComponentTransform().Rotator();
FrameQueue.Enqueue([this] {
LeftHand.Translate(FVector(0, -50, 0));
RightHand.Translate(FVector(0, 50, 0));
LeftHand.SetGrabbing(true);
RightHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() == 2);
LeftHand.Translate(FVector(0, 50, -50));
RightHand.Translate(FVector(0, -50, 50));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedRotation] {
const FTransform Result = Target->GetOwner()->GetTransform();
const FRotator ResultRotation = Result.Rotator();
TestTrue("Objects rotation didn't change with hands rotation", ExpectedRotation.Equals(ResultRotation));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
}
void ManipulatorConstraintSpec::EnqueueFixedDistanceConstraintTests()
{
Describe("UxtFixedDistanceConstraint", [this] {
BeforeEach([this] {
Constraint = NewObject<UUxtFixedDistanceConstraint>(Target->GetOwner());
Constraint->RegisterComponent();
});
LatentIt("Should maintain fixed distance to camera", [this](const FDoneDelegate& Done) {
const FTransform HeadPose = UUxtFunctionLibrary::GetHeadPose(UxtTestUtils::GetTestWorld());
const float ExpectedDistance = FVector::Dist(Target->GetOwner()->GetActorLocation(), HeadPose.GetLocation());
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Translate(FVector(100, 100, 100));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, HeadPose, ExpectedDistance] {
const float Result = FVector::Dist(Target->GetOwner()->GetActorLocation(), HeadPose.GetLocation());
TestEqual("Distance did not change", Result, ExpectedDistance);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("Should maintain fixed distance to object", [this](const FDoneDelegate& Done) {
AActor* ConstraintObject = UxtTestUtils::GetTestWorld()->SpawnActor<AActor>();
USceneComponent* RootComponent = NewObject<USceneComponent>(ConstraintObject);
RootComponent->SetWorldLocation(TargetLocation + FVector(50, 50, 50));
ConstraintObject->SetRootComponent(RootComponent);
UUxtFixedDistanceConstraint* FixedDistanceConstraint = Cast<UUxtFixedDistanceConstraint>(Constraint);
FixedDistanceConstraint->ConstraintComponent.OtherActor = ConstraintObject;
const float ExpectedDistance = FVector::Dist(Target->GetOwner()->GetActorLocation(), ConstraintObject->GetActorLocation());
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Translate(FVector(100, 100, 100));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ConstraintObject, ExpectedDistance] {
const float Result = FVector::Dist(Target->GetOwner()->GetActorLocation(), ConstraintObject->GetActorLocation());
TestEqual("Distance did not change", Result, ExpectedDistance);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
ConstraintObject->Destroy();
});
});
}
void ManipulatorConstraintSpec::EnqueueMaintainApparentSizeConstraintTests()
{
Describe("UxtMaintainApparentSizeConstraint", [this] {
BeforeEach([this] {
Constraint = NewObject<UUxtMaintainApparentSizeConstraint>(Target->GetOwner());
Constraint->RegisterComponent();
});
LatentIt("Should maintain apparent size with one hand", [this](const FDoneDelegate& Done) {
const FVector Translation = FVector::ForwardVector * 200;
const FTransform HeadPose = UUxtFunctionLibrary::GetHeadPose(UxtTestUtils::GetTestWorld());
const float InitialDistance = FVector::Dist(Target->GetOwner()->GetActorLocation(), HeadPose.GetLocation());
const float DistanceScaling = InteractionMode == EUxtInteractionMode::Far ? 3 : 1;
const float ExpectedDistance = InitialDistance + (Translation.Size() * DistanceScaling);
const FVector ExpectedScale = (ExpectedDistance / InitialDistance) * Target->GetOwner()->GetActorScale();
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this, Translation] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
RightHand.Translate(Translation);
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this, ExpectedScale] {
const FVector Result = Target->GetOwner()->GetActorScale();
TestEqual("Should have scaled accordingly", Result, ExpectedScale, 0.1);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("Should maintain apparent size with two hands", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this] {
LeftHand.Translate(FVector(0, -50, 0));
RightHand.Translate(FVector(0, 50, 0));
LeftHand.SetGrabbing(true);
RightHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] {
TestTrue("Component is grabbed", Target->GetGrabPointers().Num() > 0);
LeftHand.Translate(FVector(0, 50, -50));
RightHand.Translate(FVector(0, -50, 50));
});
// Skip a frame to ensure the manipulator has updated the object.
FrameQueue.Skip();
FrameQueue.Enqueue([this] {
const FVector Result = Target->GetOwner()->GetActorScale();
TestEqual("Should have scaled accordingly", Result, FVector(1, 1, 1), 0.1);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
}
void ManipulatorConstraintSpec::EnqueueMoveAxisConstraintTests()
{
Describe("UxtMoveAxisConstraint", [this] {
BeforeEach([this] {
UUxtMoveAxisConstraint* MoveAxisConstraint = NewObject<UUxtMoveAxisConstraint>(Target->GetOwner());
MoveAxisConstraint->ConstraintOnMovement = static_cast<int32>(EUxtAxisFlags::X);
Constraint = MoveAxisConstraint;
Constraint->RegisterComponent();
});
LatentIt("should restrict movement in X direction", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this, Done] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] { RightHand.Translate(FVector(200, 200, 200)); });
FrameQueue.Enqueue([this] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
const FVector ExpectedLocation = TargetLocation + FVector(0, 200, 200);
TestEqual(TEXT("Object didn't move as expected"), NewLocation, ExpectedLocation);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("should apply constraints that get added during runtime", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this, Done] {
Constraint->DestroyComponent();
Constraint = nullptr;
});
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] { RightHand.Translate(FVector(200, 200, 200)); });
FrameQueue.Enqueue([this] {
PositionCache = Target->GetOwner()->GetActorLocation();
UUxtMoveAxisConstraint* MoveAxisConstraint = NewObject<UUxtMoveAxisConstraint>(Target->GetOwner());
MoveAxisConstraint->ConstraintOnMovement = static_cast<int32>(EUxtAxisFlags::X);
Constraint = MoveAxisConstraint;
Constraint->RegisterComponent();
});
FrameQueue.Enqueue([this] { RightHand.Translate(FVector(-200, -200, -200)); });
FrameQueue.Enqueue([this] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
const FVector ExpectedLocation = FVector(PositionCache.X, 0, 0);
TestEqual(TEXT("Object didn't move as expected"), NewLocation, ExpectedLocation);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("should react on constraint component detach during runtime", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this, Done] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] {
Constraint->DestroyComponent();
Constraint = nullptr;
});
FrameQueue.Enqueue([this] { RightHand.Translate(FVector(200, 200, 200)); });
FrameQueue.Enqueue([this] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
const FVector ConstraintLocation = TargetLocation + FVector(0, 200, 200);
TestNotEqual(TEXT("object still had constraint applied"), NewLocation, ConstraintLocation);
TestTrue(TEXT("object did not move as expected in x direciton"), NewLocation.X >= (TargetLocation.X + 200));
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("should restrict movement in X direction local space", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this, Done] {
Target->GetOwner()->SetActorRotation(FQuat(FVector::RightVector, FMath::DegreesToRadians(90)));
UUxtMoveAxisConstraint* MoveAxisConstraint = Cast<UUxtMoveAxisConstraint>(Constraint);
MoveAxisConstraint->ConstraintOnMovement = static_cast<int32>(EUxtAxisFlags::X);
MoveAxisConstraint->bUseLocalSpaceForConstraint = true;
RightHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] { RightHand.Translate(FVector(200, 200, 200)); });
FrameQueue.Enqueue([this, Done] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
// Because the object is rotated 90 degrees, locking the local X axis corresponds to the global Z axis.
const FVector ExpectedLocation = FVector(NewLocation.X, NewLocation.Y, 0);
TestEqual(TEXT("object didn't move as expected"), NewLocation, ExpectedLocation, 0.001f);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("should allow multiple constraints", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this, Done] {
UUxtMoveAxisConstraint* MoveAxisConstraint = NewObject<UUxtMoveAxisConstraint>(Target->GetOwner());
MoveAxisConstraint->ConstraintOnMovement = static_cast<int32>(EUxtAxisFlags::Y);
MoveAxisConstraint->RegisterComponent();
});
FrameQueue.Enqueue([this] { RightHand.SetGrabbing(true); });
FrameQueue.Enqueue([this] { RightHand.Translate(FVector(200, 200, 200)); });
FrameQueue.Enqueue([this, Done] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
const FVector ExpectedLocation = TargetLocation + FVector(0, 0, 200);
TestEqual(TEXT("object didn't move as expected"), NewLocation, ExpectedLocation);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
LatentIt("should restrict movement for two hand interaction", [this](const FDoneDelegate& Done) {
FrameQueue.Enqueue([this] {
LeftHand.Translate(FVector(0, 0, 5));
RightHand.SetGrabbing(true);
LeftHand.SetGrabbing(true);
});
FrameQueue.Enqueue([this] {
RightHand.Translate(FVector(200, 200, 200));
LeftHand.Translate(FVector(200, 200, 200));
});
FrameQueue.Enqueue([this] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
const FVector ExpectedLocation = TargetLocation + FVector(0, 200, 200);
TestEqual(TEXT("object didn't move as expected"), NewLocation, ExpectedLocation);
// Add another constraint that applies only to one handed movement
UUxtMoveAxisConstraint* MoveAxisConstraint = NewObject<UUxtMoveAxisConstraint>(Target->GetOwner());
MoveAxisConstraint->ConstraintOnMovement = static_cast<int32>(EUxtAxisFlags::Y);
MoveAxisConstraint->HandType = static_cast<int32>(EUxtGrabMode::OneHanded);
MoveAxisConstraint->RegisterComponent();
});
FrameQueue.Enqueue([this, Done] {
RightHand.Translate(FVector(200, 200, 200));
LeftHand.Translate(FVector(200, 200, 200));
});
FrameQueue.Enqueue([this, Done] {
const FVector NewLocation = Target->GetOwner()->GetActorLocation();
const FVector ExpectedLocation = TargetLocation + FVector(0, 400, 400);
TestEqual(TEXT("object didn't move as expected"), NewLocation, ExpectedLocation);
});
FrameQueue.Enqueue([Done] { Done.Execute(); });
});
});
}
#endif // WITH_DEV_AUTOMATION_TESTS
|
The invention relates to a power mover for use in a mobile irrigating system and more particularly to a power mover connected in the center of a line of coupled wheel-mounted irrigating pipes.
Various types of power movers are used in mobile irrigating systems. In some such systems a plurality of hydraulic motors are provided along a length of the coupled irrigating pipes to move the pipes. In other systems lengths of wheel-mounted irrigating pipes which carry sprinkler heads are coupled together in a long line to be rolled across the field to be irrigated. In order to provide the motive power to rotate the line of wheeled irrigating pipes, a power unit is connected in the middle of the line of irrigating pipes.
The centralized power unit applies a torque to the wheel-mounted pipes through a motor-driven bull gear which is mounted on a wheeled carriage which straddles the pipe. Driving power is usually also provided to the wheels of the carriage as well as to the bull gear.
All prior art mobile irrigating systems suffer from a number of disadvantages involving loss of power, maintenance, or inconvenient controls. For example, in order to allow the driving torque to be reversed, in some systems the motor turns the bull gear through a hydraulic drive assembly. One problem with this type of mover is that the hydraulic pump can require a high degree of maintenance to operate properly. In still other systems, rotational power from a motor at the side end of the pipes is coupled to two or more planetary transmissions on the central unit via a drive shaft which extends along the length of coupled pipes. One disadvantage of this type of system is frictional loss, particularly when the line of irrigating pipes becomes misaligned or slightly bent as the system is moved across the field.
In systems having a centralized power mover with an internal combustion engine, the operator must walk out to the midpoint of the line of pipes, which may be as long as 1/8 to 1/4 of a mile, and then start the engine in the desired direction and walk with the assembly until it has reached the new position. He must then stop the engine and walk back out to the end of the line of pipes and couple it up to the water supply system. In farm operations where labor is scarce or expensive, such a requirement becomes highly undesirable because it needlessly consumes a relatively large amount of the worker's time.
In many mobile irrigation systems, since the sprinkler heads are attached directly to the lengths of rotating pipe, alignment can be a critical problem: the entire line of coupled pipes must be rotated to the new position and stopped at exactly the point where the sprinkler heads are pointed in the proper direction so that they will properly irrigate the field. This sometimes requires a certain amount of jockeying of the power mover. In systems having hydraulic drives this is accomplished by reversing the flow of hydraulic fluid to the drive. In systems utilizing a direct-drive internal combustion engine, however, this jockeying is much more difficult to carry out. One system attempts to overcome this problem by utilizing the technique described above of a drive shaft mounted on the pipes to transmit power to the central unit from a hand-held engine connected to the drive shaft at the end of the coupled pipes nearest the side of the field. The system is made reversible simply by providing couplings on both ends of the engine drive shaft. To reverse the direction of travel the engine is uncoupled from the pipe-mounted drive shaft, turned 180° and recoupled to the pipe-mounted drive shaft. A major disadvantage of this system is that the engine must be physically lifted by the operator, thus precluding its operation by most women and children. Since women and older children often work on family-owned farms, this disadvantage makes the use of such a system on a family-owned farm uneconomical.
It is necessary for all these systems to operate over an entire season with a minimal amount of maintenance. Electric drive motors are thus extremely impractical, first because of the difficulty in properly insulating them from the water spray, and secondly because of the weight of the batteries which would be required to power the unit through an entire season. Even with an internal combustion engine the problem in some systems of keeping the unit powered throughout an entire season can be vexing since fuel must usually be carried out to the middle of the field. |
#! /usr/bin/env python
"""
File: yx ODE.py
Copyright (c) 2016 <NAME>
License: MIT
Course: PHYS227
Assignment: C.3
Date: March 17
Email: <EMAIL>
Name: <NAME>
Description: Solves an ODE problem using the forward Euler method
"""
import numpy as np
import matplotlib.pyplot as plt
def Euler(f, xa, xb, ya, h):
    """Integrate dy/dx = f(y) from x = xa to x = xb with the forward Euler method.

    Starts from the initial condition y(xa) = ya and uses a fixed step size h.
    Returns (y_values, x_values).
    """
    y_values = []
    x_array = []
    x = xa
    x_array.append(x)
    y = ya
    y_values.append(y)
    while x < xb:
        y += h * f(y)  # forward Euler step: y_{n+1} = y_n + h * f(y_n)
        y_values.append(y)
        x += h
        x_array.append(x)
    return y_values, x_array

def func(y):
    """Right-hand side of the ODE dy/dx = 1 / (2 * (y - 1))."""
    return 1 / (2.0 * (y - 1))

def ans(x):
    """Exact solution y(x) = 1 + sqrt(x + 1e-3) of the ODE above."""
    return 1 + np.sqrt(x + 1e-3)

def run():
    """Plot the Euler approximations for three step sizes against the exact solution."""
    ans1 = Euler(func, 0, 4, 1 + np.sqrt(1e-3), 1)
    ans2 = Euler(func, 0, 4, 1 + np.sqrt(1e-3), 0.25)
    ans3 = Euler(func, 0, 4, 1 + np.sqrt(1e-3), 0.01)
    x = np.linspace(0, 4, 1000)
    y = ans(x)
    plt.plot(ans1[1], ans1[0])
    plt.plot(ans2[1], ans2[0])
    plt.plot(ans3[1], ans3[0])
plt.plot(x, y) |
/**
* @author: <<EMAIL>> <NAME>
* @class: AlertModel
* @version: 0.1.0
* @description: Alert Object Model.
* @exports: object
*/
export interface Alert {
id: number;
name: string;
type: string;
counter: number;
}
|
#<pycode_BC695(py_netnode)>
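# Backward-compatibility aliases: expose the old netnode method names by
# pointing them at their renamed counterparts.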
netnode.alt1st = netnode.altfirst
netnode.alt1st_idx8 = netnode.altfirst_idx8
netnode.altnxt = netnode.altnext
netnode.char1st = netnode.charfirst
netnode.char1st_idx8 = netnode.charfirst_idx8
netnode.charnxt = netnode.charnext
netnode.hash1st = netnode.hashfirst
netnode.hashnxt = netnode.hashnext
netnode.sup1st = netnode.supfirst
netnode.sup1st_idx8 = netnode.supfirst_idx8
netnode.supnxt = netnode.supnext
#</pycode_BC695(py_netnode)>
|
Metabolic engineering of Ralstonia eutropha for the production of polyhydroxyalkanoates from sucrose. A sucrose utilization pathway was established in Ralstonia eutropha NCIMB11599 and R. eutropha 437-540 by introducing the Mannheimia succiniciproducens MBEL55E sacC gene that encodes β-fructofuranosidase. These engineered strains were examined for the production of poly(3-hydroxybutyrate) [P(3HB)] and poly(3-hydroxybutyrate-co-lactate) [P(3HB-co-LA)], respectively, from sucrose as a carbon source. It was found that β-fructofuranosidase excreted into the culture medium could hydrolyze sucrose to glucose and fructose, which were efficiently used as carbon sources by recombinant R. eutropha strains. When R. eutropha NCIMB11599 expressing the sacC gene was cultured in nitrogen-free chemically defined medium containing 20 g/L of sucrose, a high P(3HB) content of 73.2 wt% could be obtained. In addition, R. eutropha 437-540 expressing the Pseudomonas sp. MBEL 6-19 phaC1437 gene and the Clostridium propionicum pct540 gene accumulated P(3HB-co-21.5 mol% LA) to a polymer content of 19.5 wt% from sucrose by the expression of the sacC gene and the Escherichia coli ldhA gene. The molecular weights (Mn) of P(3HB) and P(3HB-co-21.5 mol% LA) synthesized in R. eutropha using sucrose as a carbon source were 3.5210 and 2.1910, respectively. The engineered R. eutropha strains reported here will be useful for the production of polyhydroxyalkanoates (PHAs) from sucrose, one of the most abundant and relatively inexpensive carbon sources. |
// apps/client/src/app/pages/allagan-reports/allagan-reports/allagan-reports.component.ts
import { ChangeDetectionStrategy, ChangeDetectorRef, Component } from '@angular/core';
import { AllaganReportsService } from '../allagan-reports.service';
import { NzModalService } from 'ng-zorro-antd/modal';
import { TranslateService } from '@ngx-translate/core';
import { distinctUntilChanged, filter, map, shareReplay } from 'rxjs/operators';
import { combineLatest } from 'rxjs';
import { AuthFacade } from '../../../+state/auth.facade';
import { AllaganReportQueueEntry } from '../model/allagan-report-queue-entry';
import { NzMessageService } from 'ng-zorro-antd/message';
import { AllaganReportStatus } from '../model/allagan-report-status';
import { AllaganReportSource } from '../model/allagan-report-source';
import { uniq } from 'lodash';
import { LazyDataFacade } from '../../../lazy-data/+state/lazy-data.facade';
@Component({
selector: 'app-allagan-reports',
templateUrl: './allagan-reports.component.html',
styleUrls: ['./allagan-reports.component.less'],
changeDetection: ChangeDetectionStrategy.OnPush
})
export class AllaganReportsComponent {
AllaganReportStatus = AllaganReportStatus;
AllaganReportSource = AllaganReportSource;
public reportSources = uniq(Object.keys(AllaganReportSource));
public applyingChange = false;
public dirty = false;
public selectCount = 0;
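// Queue entries from the reports service, with a local `selected` flag added for batch actions.
// Emissions are ignored while the local selection state is dirty, to avoid clobbering in-progress checks.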
public queueStatus$ = this.allaganReportsService.getQueueStatus().pipe(
filter(() => !this.dirty),
map(rows => {
return rows.map(row => {
return {
...row,
selected: false
};
});
})
);
isDataChecker$ = this.authFacade.user$.pipe(
map(user => user.admin || user.moderator || user.allaganChecker),
distinctUntilChanged(),
shareReplay(1)
);
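// Dashboard statistics derived from the lazy data (extracts, fishes, items) and the reports dashboard:
// report counts, fish coverage percentage, and the fish/items that still have no source data.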
public status$ = combineLatest([
this.lazyData.getEntry('extracts'),
this.lazyData.getEntry('fishes'),
this.lazyData.getEntry('items'),
this.allaganReportsService.getDashboardData()
]).pipe(
map(([extracts, fishes, items, dashboardData]) => {
const fishWithNoData = fishes
.filter(itemId => {
return itemId < 200000 && !items[itemId].en.includes('Skybuilders')
&& extracts[itemId].sources.length === 0;
});
return {
reportsCount: dashboardData.reportsCount,
appliedReportsCount: dashboardData.appliedReportsCount,
fishCoverage: Math.floor(1000 * (fishes.length - fishWithNoData.length) / fishes.length) / 10,
fishWithNoData,
itemsWithNoSource: Object.values(extracts).filter(e => {
const enName = items[e.id].en;
const frName = items[e.id].fr;
return !fishWithNoData.includes(e.id)
&& !['Dated', 'Skybuilders'].some(ignored => enName.indexOf(ignored) > -1)
&& !/S\d{1,2}$/.test(frName) && enName.length > 0
&& e.sources.length === 0;
}).map(e => e.id).sort((a, b) => b - a)
};
})
);
public sourceFilter$ = this.allaganReportsService.filter$;
constructor(public allaganReportsService: AllaganReportsService,
private dialog: NzModalService, public translate: TranslateService,
private lazyData: LazyDataFacade, private authFacade: AuthFacade,
private message: NzMessageService, private cd: ChangeDetectorRef) {
}
public saveSourceFilter(sources: AllaganReportSource[]): void {
this.allaganReportsService.filter$.next(sources);
}
public onRowChecked($event: boolean): void {
this.dirty = true;
if ($event) {
this.selectCount++;
} else {
this.selectCount--;
}
}
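// Shift-click selects the contiguous run of unselected proposals above the clicked row,
// stopping at the first already-selected entry and skipping fishing/spearfishing reports.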
public handleCheckboxClick($event: MouseEvent, rows: AllaganReportQueueEntry[], index: number): void {
if ($event.shiftKey) {
const slice = rows.slice(0, index).reverse();
for (const row of slice) {
if (row.selected) {
break;
}
if (row.type === AllaganReportStatus.PROPOSAL && row.source !== AllaganReportSource.FISHING && row.source !== AllaganReportSource.SPEARFISHING) {
row.selected = true;
this.selectCount++;
}
}
this.cd.detectChanges();
}
}
getColor(status: AllaganReportStatus): string {
switch (status) {
case AllaganReportStatus.ACCEPTED:
return 'darkgreen';
case AllaganReportStatus.DELETION:
return '#f50';
case AllaganReportStatus.MODIFICATION:
return '#f2b10e';
case AllaganReportStatus.PROPOSAL:
return '#108ee9';
}
}
acceptProposal(entry: AllaganReportQueueEntry): void {
this.dirty = false;
this.allaganReportsService.acceptProposal(entry).subscribe(() => {
this.message.success(this.translate.instant('ALLAGAN_REPORTS.Proposal_accepted'));
this.applyingChange = false;
this.cd.detectChanges();
});
}
rejectProposal(entry: AllaganReportQueueEntry): void {
this.dirty = false;
this.allaganReportsService.reject(entry).subscribe(() => {
this.message.success(this.translate.instant('ALLAGAN_REPORTS.Proposal_rejected'));
this.applyingChange = false;
this.cd.detectChanges();
});
}
acceptMany(entries: AllaganReportQueueEntry[]): void {
this.dirty = false;
this.allaganReportsService.acceptManyProposal(entries.filter(e => e.selected).map(e => {
delete e.selected;
return e;
})).subscribe(() => {
this.message.success(this.translate.instant('ALLAGAN_REPORTS.Proposal_accepted'));
this.applyingChange = false;
this.selectCount = 0;
this.cd.detectChanges();
});
}
rejectMany(entries: AllaganReportQueueEntry[]): void {
this.dirty = false;
this.allaganReportsService.rejectMany(entries.filter(e => e.selected).map(e => {
delete e.selected;
return e;
})).subscribe(() => {
this.message.success(this.translate.instant('ALLAGAN_REPORTS.Proposal_rejected'));
this.applyingChange = false;
this.selectCount = 0;
this.cd.detectChanges();
});
}
}
|
// src/main/java/com/yuukaze/i18next/service/WindowManager.java
package com.yuukaze.i18next.service;
import com.intellij.openapi.wm.ToolWindow;
import com.yuukaze.i18next.ui.TableView;
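/**
 * Singleton holder for the plugin's tool window and table view, so other services
 * can reach them after {@link #initialize(ToolWindow, TableView)} has been called.
 */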
public class WindowManager {
private static WindowManager INSTANCE;
private ToolWindow toolWindow;
private TableView tableView;
public static WindowManager getInstance() {
return INSTANCE == null ? INSTANCE = new WindowManager() : INSTANCE;
}
private WindowManager() {}
public void initialize(ToolWindow toolWindow, TableView tableView) {
this.toolWindow = toolWindow;
this.tableView = tableView;
}
public ToolWindow getToolWindow() {
return toolWindow;
}
public TableView getTableView() {
return tableView;
}
} |
The
Unschooling
Unmanual
Nanda Van Gestel
Jan Hunt
Daniel Quinn
Rue Kream
Earl Stevens
Kim Houssenloge
John Holt
Mary Van Doren
Edited by Jan and Jason Hunt
Book design by Jason Hunt
The Natural Child Project
naturalchild.org
Library and Archives Canada Cataloguing in Publication
The unschooling unmanual/Nanda van Gestel... [et al.] ; edited by Jan and Jason Hunt ; book design by Jason Hunt.
Includes bibliographical references.
ISBN 978-0-9685754-5-1 (bound)
Ebook ISBN 978-0-9685754-8-2
1. Home schooling. 2. Non-formal education. 3. Education—Parent participation. I. Van Gestel, Nanda, 1964- II. Hunt, Jan, 1942- III. Hunt, Jason, 1981-.
LC40.U58 2008 371.04′2 C2007-907656-4
First edition. © 2008 The Natural Child Project (naturalchild.org). All rights reserved. For reprint permission, orders, and other information, visit naturalchild.org/contact.
Front and back cover design and book layout by Jason Hunt. Cover photo of Jason, age 4, by Jan Hunt. Back cover photo of Koen and Jochem by Nanda Van Gestel.
"Schooling: The Hidden Agenda" was presented at the Houston Unschoolers Group Family Learning Conference, October 2000. © 2000 Daniel Quinn.
Jan Hunt's articles "The Natural Love of Learning" and "How Do We Know They're Learning?" are adapted from _The Natural Child: Parenting From the Heart._ © 2001.
Rue Kream's articles "Why Choose Unschooling?" and "What About College?" are reprinted from her book _Parenting a Free Child: An Unschooled Life._ © 2005.
"Every Waking Hour" is excerpted from _Learning All the Time_ by John Holt. © 2005. Reprinted by arrangement with Basic Books, a member of the Perseus Books Group (perseusbooks.com). All rights reserved.
"Mary's Memoirs" by Mary Van Doren originally appeared in _Growing Without Schooling_ in the 1980s. Reprinted with permission of Holt Associates/ _Growing Without Schooling_ (holtgws.com).
To order _The Unschooling Unmanual_ directly from the publishers, and for more information about this book, visit naturalchild.org/unmanual.
Dedicated to John Caldwell Holt
1923-1985
"Little children love the world. That is why they are so good at learning about it. For it is love, not tricks and techniques of thought, that lies at the heart of all true learning. Can we bring ourselves to let children learn and grow through that love?"
John Holt
# Contents
Why Choose Unschooling?
Rue Kream

The Natural Love of Learning
Jan Hunt

An Unschooling Adventure
Nanda Van Gestel

Why I Chose Unschooling
Kim Houssenloge

Schooling: The Hidden Agenda
Daniel Quinn

How Do We Know They're Learning?
Jan Hunt

What is Unschooling?
Earl Stevens

Learning Through Play
Jan Hunt

What About College?
Rue Kream

Learning to Trust
Jan Hunt

Every Waking Hour
John Holt

Mary's Memoirs
Mary Van Doren
# Why Choose Unschooling?
by Rue Kream
"A person's freedom of learning is part of his freedom of thought, even more basic than his freedom of speech."
–John Holt
_Why did you choose unschooling rather than some other form of homeschooling?_
**I** always knew that the way "everybody" lived didn't feel right to me. I used to imagine that, when I grew up, I would live on an island with my family. From a very young age I struggled to understand what life really meant. As I grew, I came up with some answers. Life for me is truly feeling the earth underneath me and seeing the things around me. It is enjoying every moment with the people I love. It is making another person smile. It is thinking and dreaming, feeling pain and feeling joy.
When Dagny was a baby, I started to ask myself new questions. Does it matter if we know our multiplication tables? Is accumulation of knowledge the goal of life? Should there _be_ a goal of life? Why should we spend her childhood apart from each other when we both want so much to be together? Can we step off the well-worn path and find our own way?
When I learned that unschooling was a possibility, I was thrilled that we could continue to live as we had been since Dagny was born. I found the answers to my questions, which in reality I had known all along. Children belong with their families. Nothing is more important than living in connection with the ones you love and sharing life's experiences. We can't help but learn as we live full and interesting lives together. When we rejected the kind of life that comes with a road map, we were able to question what it was we wanted from our lives, and to determine what we do not want. We want joy. We want to know that we lived consciously and in the moment. We do not want to mold our children. We want them to have the freedom to choose their lives. We do not want to ever feel that we wasted time we could have spent together.
_Children belong with their families._
Our major reasons for unschooling have nothing to do with academics, but of course there are reasons we choose not to teach our children. We believe that children (humans) seek out knowledge in the same way they seek out fun or food, and we believe that adults can do a lot to interfere with that desire to learn. We don't believe that repetition is necessary or that there is a list of things that every person needs to know. We believe that turning the relationship of parent and child into a relationship between teacher and student is detrimental. We want our children to own their learning and to learn for their own reasons, not to please a teacher.
Jon and I have determined what it is we live by, what matters, and what does not. It has evolved and will continue to evolve as we face new challenges and joys in our lives. We want to choose the lives we lead, and we want our children to have the opportunity to do the same.
Ultimately I'd say that the reason we choose to unschool is because we want our children to be truly free.
© 2005 Rue Kream
"School always appeared to me like a prison, and I could never make up my mind to stay there, when the sunshine was inviting, the sea smooth, and when it was such a joy to run about in the free air, or to paddle around in the water."
Claude Monet
# The Natural Love of Learning
by Jan Hunt
**T**he main element in successful unschooling is trust. We trust our children to know when they are ready to learn and what they are interested in learning. We trust them to know how to go about learning. Parents commonly take this view of learning during the child's first two years, when he is learning to stand, walk, talk, and to perform many other important and difficult things, with little help from anyone. No one worries that a baby will be too lazy, uncooperative, or unmotivated to learn these things; it is simply assumed that every baby is born wanting to learn the things he needs to know in order to understand and to participate in the world around him. These one- and two-year-old experts teach us several principles of learning:
**Children are naturally curious and have a built-in desire to learn first-hand about the world around them.**
John Holt, in his book _How Children Learn,_ describes the natural learning style of young children:
"The child is curious. He wants to make sense out of things, find out how things work, gain competence and control over himself and his environment, and do what he can see other people doing. He is open, perceptive, and experimental. He does not merely observe the world around him. He does not shut himself off from the strange, complicated world around him, but tastes it, touches it, hefts it, bends it, breaks it. To find out how reality works, he works on it. He is bold. He is not afraid of making mistakes. And he is patient. He can tolerate an extraordinary amount of uncertainty, confusion, ignorance, and suspense.... School is not a place that gives much time, or opportunity, or reward, for this kind of thinking and learning."
**Children know best how to go about learning something.**
If left alone, children will know instinctively what method is best for them. Caring and observant parents soon learn that it is safe and appropriate to trust this knowledge. Such parents say to their baby, "Oh, that's interesting! You're learning how to crawl downstairs by facing backwards!" They do not say, "That's the wrong way." Perceptive parents are aware that there are many different ways to learn something, and they trust their children to know which ways are best for them.
**Children need plentiful amounts of quiet time to think.**
As John Holt noted in _Teach Your Own,_ "Children who are good at fantasizing are better both at learning about the world and at learning to cope with its surprises and disappointment. It isn't hard to see why this should be so. In fantasy we have a way of trying out situations, to get some feel of what they might be like, or how we might feel in them, without having to risk too much. It also gives us a way of coping with bad experiences, by letting us play and replay them in our mind until they have lost much of their power to hurt, or until we can make them come out in ways that leave us feeling less defeated and foolish."
But fantasy requires time, and time is the most endangered commodity in our lives. Fully-scheduled school hours and extracurricular activities leave little time for children to dream, to think, to invent solutions to problems, to cope with stressful experiences, or simply to fulfill the universal need for solitude and privacy.
**Children are not afraid to admit ignorance and to make mistakes.**
When Holt invited toddlers to play his cello, they would eagerly attempt to do so; schoolchildren and adults would invariably decline. Unschooling children, free from the intimidation of public embarrassment and failing marks, retain their openness to new exploration. Children learn by asking questions, not by answering them. Toddlers ask many questions, and so do school children — until about grade three. By that time, many of them have learned an unfortunate fact: that in school, it can be more important for self-protection to hide one's ignorance about a subject than to learn more about it, regardless of one's curiosity.
_Children learn by asking questions,
not by answering them._
**Children take joy in the intrinsic values of whatever they are learning.**
There is no need to motivate children through the use of extrinsic rewards, such as high grades or stars, which suggest to the child that the activity itself must be difficult or unpleasant; otherwise, why is a reward, which has nothing to do with the matter at hand, being offered? The wise parent says, "I think you'll enjoy this book", not "If you read this book, you'll get a cookie."
**Children learn best about getting along with other people through interaction with those of all ages.**
No parents would tell their baby, "You may only spend time with those children whose birthdays fall within six months of your own. Here's another two-year-old to play with."
In his book _Dumbing Us Down,_ New York State Teacher of the Year John Taylor Gatto states, "It is absurd and anti-life to be part of a system that compels you to sit in confinement with people of exactly the same age and social class. That system effectively cuts you off from the immense diversity of life and the synergy of variety; indeed, it cuts you off from your own past and future, sealing you in a continuous present...."
**Children learn best about the world through first-hand experience.**
No parent would tell her toddler, "Let's put that caterpillar down and get back to your book about caterpillars." Unschoolers learn directly about the world. Our son describes unschooling as "learning by doing instead of being taught". Ironically, the most common objection about unschooling is that children are "being deprived of the real world".
**Children need and deserve ample time with their family.**
Gatto cautions us, "Between schooling and television, all the time children have is eaten up. That's what has destroyed the American family." Many unschooling parents feel that family cohesiveness is perhaps the most meaningful benefit of the experience. Just as I saw his first step and heard his first word, I have the honor and privilege of sharing my son's world and thoughts. Over the years, I have discovered more from him about life, learning, and love, than from any other source. The topic we seem to be learning the most about is the nature of learning itself. I sometimes wonder who learns more in unschooling families, the parents or the children!
**Stress interferes with learning.**
As Albert Einstein wrote in his _Autobiographical Notes,_ "It is a very grave mistake to think that the enjoyment of seeing and searching can be promoted by means of coercion." When a one-year-old falls down while learning to walk, we say, "Good try! You'll catch on soon!" No caring parent would say, "Every baby your age should be walking. You'd better be walking by Friday!"
Most parents understand how difficult it is for their children to learn something when they are rushed, threatened, or given failing grades. John Holt warned that "we think badly, and even perceive badly, or not at all, when we are anxious or afraid... when we make children afraid, we stop learning dead in its tracks." While infants and toddlers teach us many principles of learning, schools have adopted quite different principles, due to the difficulties inherent in teaching a large number of same-age children in a compulsory setting. The structure of school — required attendance, school-selected topics and books, and constant checking of the child's progress — assumes that children are not natural learners, but must be compelled to learn through the efforts of others.
Natural learners do not need such a structure. The success of self-directed learning (unschoolers regularly outperform their schooled peers on measures of achievement, socialization, confidence, and self-esteem) strongly suggests that structured approaches inhibit both learning and personal development. Because unschooling follows principles of natural learning, children retain the curiosity, enthusiasm, and love of learning that every child has at birth.
Unschooling, as Holt writes in _How Children Learn_, is a matter of faith. "This faith is that by nature people are learning animals. Birds fly; fish swim; humans think and learn. Therefore, we do not need to motivate children into learning by wheedling, bribing, or bullying. We do not need to keep picking away at their minds to make sure they are learning. What we need to do — and all we need to do — is to give children as much help and guidance as they need and ask for, listen respectfully when they feel like talking, and then get out of the way. We can trust them to do the rest."
©2001 Jan Hunt
Mary's Memoirs by Mary Van Doren
A Way of Life
The whole idea of natural learning has evolved over the past few years from something we thought would be good for our children, to a way of life; part of a way of life, actually, which is itself still evolving for us.
I think freedom is the key to all of this — freedom to raise our children the best way we can, freedom for us and our children to learn and grow at the right time. With the children, I find more and more that we must let them do what they feel they need to do. We never know what will come from a particular activity.
One example pointed this out to me very clearly recently. Helen (3) suddenly started making doll beds everywhere, especially in the linen closet. As irritating as I occasionally found this, I didn't say anything about it and somehow was able to leave the sheets and pillowcases as they were, layered with dolls. Now, suddenly, Helen makes the other beds in the house. I feel that she had done some training for it on her own with the dolls, and we were all lucky that I didn't interfere. She doesn't do it every day, but then neither do I.
# An Unschooling Adventure
by Nanda Van Gestel
Our Family
Knowledge and Wisdom
Freedom
Mutual Respect
Attachment Parenting
A Healthy Balance
Natural Reading
The Arts
Math in Everyday Life
The Art of Playing
Taking Time
Friendship
Joy
Learning from Life
Dreams
Trust
"There is no difference between living and learning... it is impossible, and misleading, and harmful to think of them as being separate."
John Holt
## Our Family
**M**y name is Nanda. I am married to Hans, and together we have four boys: Rutger, Stijn, Jochem, and Koen. We are a Dutch family who left the Netherlands and moved to the U.S., and later Ireland, to be able to unschool our children. We are currently back in the Netherlands.
Our oldest son, Rutger, was born prematurely, and we nearly lost him. There were medical complications, and it was three months before we could bring him home.
Rutger has been called a "special needs" child. We don't like to label anyone — all of our boys are special. My heart told me that if we wanted to make Rutger happy, we needed to focus on his strengths and love him unconditionally. We found that not only is he loving and sensitive, he is also intelligent. We helped him to follow his own interests, and he enriched and deepened our lives in ways we never thought possible. Most of all, he was a happy child.
When Rutger entered school at age five, all of this changed. In just a few months, our bright, confident son had turned into a scared and unhappy child. In school, he couldn't pursue his own interests, and because he wasn't challenged by what was going on in the classroom, he would escape into his own inner world. The teachers responded to this by putting more and more pressure on him — and on me. I spent many hours talking with them, but it didn't help. It became clear to me that they expected children to submit to the school system and sacrifice their own interests, even if that would break their spirit.
I wished with all my heart that we could take care of Rutger's education ourselves. Seeing my child suffer gave me the courage to follow my heart and keep him home; unfortunately, school attendance was mandatory in the Netherlands. We had no idea what would happen next, when the solution came as if by magic — my husband was offered a job in the U.S. On the Internet we learned that homeschooling is legal in all 50 states.
We didn't need long to make up our minds; we sold everything and moved. It was such an eye-opener to us that what is illegal in one country is fully accepted and considered a human right in another. In the Netherlands, people thought of us as irresponsible for wanting to take Rutger out of school; in the U.S. we were admired for taking responsibility for his education. The change we saw in him after our move was almost unbelievable, as if a heavy burden had been lifted off his shoulders. He finally had the freedom to be himself.
_Unschooling is more than an
education — it's life._
From curriculum-based homeschooling we grew toward unschooling, in which we trust that our children know what they need to learn and when they need to learn it. As George Bernard Shaw wrote, "What we want to see is the child in pursuit of knowledge, not knowledge in pursuit of the child."
It's wonderful to see our children's unique gifts and abilities, and to watch them grow and learn together. Hans and I have learned more in the last several years than ever before, due to all the different topics our children bring up, and all the questions they ask.
Unschooling is more than an education — it's life. In natural learning, everything is connected. Our children have gone from classical music to art, architecture, and ancient Rome, and from there to philosophy, Plato, Pythagoras and mathematics. In this way things make sense, and there is no end to what they can learn; one interest leads into another. Just as babies and toddlers constantly explore the world around them, never growing tired of it, so will unschooling children continue to satisfy their natural curiosity and thirst for knowledge.
As for Rutger, he is doing great — he loves to read and explore, and keeps amazing us with all the things he has learned. He is happy again, and feels proud and confident. Unschooling has liberated us all in many wonderful ways.
## Knowledge and Wisdom
**W**hile unschooling is natural for children, parents may need to rediscover how easy and joyful learning can be. This didn't come naturally to me at first. Because I went to school myself, part of me still thought that you could only learn if you sat behind a desk and listened to a teacher. Then, when you've gained the teacher's approval in the form of a good grade, you've learned something. But if that were true, how can it be that I don't remember the things I learned this way? And how can it be that the experiences that have really taught me and helped me to grow as a person took place outside of school, in real life?
My father has always said there is a big difference between knowledge and wisdom, and I think he's right. Wisdom is something that we find through life experience; it can't be taught in a classroom. As unschoolers, we live our life and learn from every situation we encounter.
Every day our children learn more about what they can do, what they believe in, and what makes them happy. That's wisdom. Knowledge comes to them because they are curious about the world they live in, because they love to read and to explore, and because they encounter everyone and everything with an open mind and heart.
## Freedom
**W**hen we do something by choice, we can be creative about it and give it our very best effort. If we choose to do something, we can enjoy it. Feeling coerced to do something is a sure way to take the fun out of it!
The school system is based on making children do things, molding and conditioning them to behave in certain ways. Children "have to" go to school, they "have to" stay in the classroom and they "have to" listen to the teacher. This approach interferes with learning, because we learn best what we are most interested in at the moment. The desire to learn must come from within; forcing children to listen won't make them learn. The use of tests and grades compels students to memorize, at least temporarily, what the teacher has told them. This way it may seem as though the students are learning; in fact they're only learning how to take tests.
_The desire to learn must come from within._
Rutger resists pressure even more than I do, and when he attended school, it literally made him ill. The best part of unschooling is that our children are free to learn what they want to, when they want to, and how they want to. We don't have a system; there is no mold that our children have to fit. Instead, unschooling is a celebration of each child's unique personality and abilities. To Rutger, unschooling meant finally being free to be himself.
Rutger wasn't the only one who felt liberated — we all felt free to make our own choices. When Rutger went to school, we all had to fit into the same schedule. I had to get up at a certain time, get my son ready and take him to school. We could no longer go on outings or vacations when we felt like it, and we couldn't accompany my husband on business trips. It was very clear that we all had to adapt to the system. Having Rutger in school drained our energy and put a lot of pressure on our whole family. Now that we're unschooling, we're free to let the day progress naturally, and that makes all of our lives much easier.
## Mutual Respect
**S** ome people think that a child's cooperation is something adults are entitled to; they think it is something they can demand. But genuine cooperation cannot be demanded — it can only be earned, and must be given freely. When children feel respected, they want to cooperate.
Children who are raised with trust and respect feel free to express their needs and opinions. When we want our children to do something, or to stop doing something, we don't tell them that they "have to" or "aren't allowed to." Instead, we express our needs and feelings, and listen to theirs, just as we would with an adult friend.
A while ago, Hans visited the home of a colleague who raises his children in the conventional, authoritarian way. My husband was saddened to see how his colleague's children were "seen but not heard." What a difference from a typical day at our house!
_When children feel respected, they want to cooperate._
It can be quite lively and busy at our place, with four boys playing, learning and talking all the time. We really value our children's opinions and ideas, and we love talking with them about different subjects. We find it fascinating to hear what they think and feel. They all have their own opinions, and would find it strange not to be allowed to express them. The question we should ask ourselves is: What does our society need most — people who will always do as they are asked, or independent, creative thinkers? People who have been dominated from an early age learn to dominate others when they get the chance, while people who are happy about who they are and have self-respect will respect others.
## Attachment Parenting
**W** hen I was pregnant and announced that I would quit my job after our baby was born, a lot of people didn't understand. They would ask, "Aren't you afraid you won't be able to talk about anything but booties and diapers?" I think it all depends on our perspective. If we're only aware of the day-to-day routine, and think that parenthood is all about changing diapers and making sure our children are dressed and well-fed, then it could indeed become mind-numbing. When we realize that we're also responsible for our child's emotional well-being, parenting becomes a lot more interesting, and our own capacity for love and compassion grows.
_Unschooling allows each child to take
their own unique learning path._
When we announced that we wanted to take care of our children's education ourselves, the reactions were often skeptical: "That's a huge responsibility; I would never dare to do it." Or, "How can children learn when they don't go to school?" As with parenting, it all depends on our own personal awareness. We had already experienced what attachment parenting could do for our children and our entire family, and now we wanted to expand that philosophy to their education. When Rutger could no longer cope with the school system, it gave us a reason to start unschooling. However, the underlying philosophy behind our decision was our holistic approach to parenting and education. Unschooling allows each child to take their own unique learning path. By following their hearts and pursuing their own interests, they learn to take responsibility for their personal growth. Hans and I are there to encourage and assist.
Of course, there are alternative schools that take a holistic approach to education, but they are still schools. Unschooling is like a home-cooked meal, prepared with love, that includes our child's favorite foods, while school is more like a prepackaged meal which may or may not match our child's preferences. No teacher can possibly take each child's unique interests into account, and all schools separate parents from children and siblings from each other. To me, the most important aspect of both parenting and education is the ability to see the loving essence within each child. No one is better at that than a loving parent.
## A Healthy Balance
**W** hen Rutger attended school, he would come home with one virus infection after another. By the time he was better, the next illness was going around, and we were all sick more often than not. The whole family had trouble staying balanced because we all had to hurry, hurry, hurry just to keep up with the school schedule. And I believe that we are all more prone to illness when we're emotionally out of balance.
Now that we're unschooling, we can set our own schedules. We're free to listen to the subtle signals our bodies are sending us, those "inner voices" that tell us how to stay mentally, emotionally and physically fit. Illnesses like the flu and colds remind us that we need to slow down. When our whole family was home recovering after a busy period of trying to keep up with everything, it would take time to regain our inner balance. Now that we're unschooling we seldom reach that point, because we can slow down and rest when we need to. In fact we've hardly ever been sick since we started unschooling.
_We can slow down and rest
when we need to._
It's not so long ago that doctors told us we should feed our babies on a strict schedule, instead of on request. Most mothers now understand that infants know best when they should be fed. Babies know perfectly well when they're hungry, and they always let us know! Why not take it a step further and trust children to know how to go about learning? Children know when they're hungry for information, and they also know when they need processing time.
I used to feel a bit guilty whenever we would spend a rainy day relaxing on the couch, watching movies. After a while I realized that a few of those "nice and easy" days would give us new energy. It's natural for children to listen to the signals of their bodies and to live according to their own rhythms. If we encourage them to trust these feelings, they will be happy and healthy, and will learn in the most natural way.
## Natural Reading
**I** grew up in a family where everyone loved books, and I was often read to. By reading a good book, we can look inside someone's mind and heart. I have always found that the right book will come into my life at the exact right time, but in school I had to read books that someone else had chosen, and it almost destroyed my love for reading. Because they like to read, our children have learned that there can be many different opinions about a subject. I think that's wonderful, because they don't assume that there is only one way of looking at things — as they might if they were in school. Instead they learn to think for themselves, take all the information they've gathered into account, and form their own opinions.
We visit our local library at least twice a week, and the boys bring long wish lists of books, software and videos they want to check out. Libraries are an unschooler's paradise — we can come and go as we please, there are no tests, and we come home with bags full of books and other materials of our own choosing. In the winter, we sit down to read in front of the fire and drink hot cocoa; in summer, we read in the shade of a big tree with some cool lemonade. The older children often read by themselves, and they also enjoy reading to the younger ones.
_The best way to teach
children how to read is simply
by reading to them._
The best way to teach children how to read is simply by reading to them — as often as we can. When they're ready to read they'll take off on their own. Children can be ready to read at different ages. Rutger was reading by himself at age five; books were his main interest. Stijn didn't read until he was eight, but when he was ready, he learned quickly.
While pressuring a child to read at a certain age may "work", it can create negative associations and jeopardize his future interest in reading. Until a child reads by himself, we can best help him by reading to him without expecting or pushing him to read on his own. In this way we can give our child a lifelong love of reading.
## The Arts
**I** love to draw, but in school art classes we couldn't choose the subject. I remember the time when I had to illustrate a bicycle in great detail. It was so uninteresting to me that it almost made me dislike art. What I liked to draw — and still do — are horses. In art class we didn't draw horses, so I drew them in secret during other lessons. When I look through my old school workbooks, I find many, many sketches of horses, and not much else.
_I drew horses in secret._
A while ago one of Stijn's friends invited him to join her after-school art class. Stijn loves drawing and painting, but when I asked if he wanted to go to the class, he decided he would rather keep painting in his own way, so instead we bought new paint and brushes.
While art classes can help us learn new techniques, those are best learned when the child is ready and interested. What is more important is having the freedom to create straight from the heart, without restrictions or judgment. Self-taught people in many fields often produce fresher, more unique, and more creative work than those who have been taught specialized techniques and methods. Nobody has ever told Stijn how you should or shouldn't paint, so he paints from the heart, without being limited by unquestioned rules or traditions.
I remember taking a field trip to an art museum when I was in school. During our visit, we were regularly quizzed, which interfered with our natural exploration and enjoyment. What a difference from the way our children experience a museum visit! While Rutger doesn't draw or paint much, he's a great art lover. He's read many books on famous painters and knows all about their work. He's the one in our family who is most excited when we visit a museum. Whether creating art or appreciating it, we will find the greatest fulfillment when we do it with our heart and soul.
## Math in Everyday Life
**W** hen we first started unschooling, I soon discovered that children learn best if we let them follow their own interests. I had my doubts about math, though. I couldn't believe that children would find math interesting or fun, because I had never really enjoyed math myself. How wrong I was!
When I tried to teach my children math, they didn't like it either. It didn't matter how much I tried to make it fun. They didn't like it because they couldn't see how math would be of use in their lives. When I stopped trying to teach them, my children showed me that math is everywhere around us. It becomes much more interesting once we stop seeing it as a separate subject, and recognize that it's a part of everyday life.
One night all four boys were gathered around the computer while Jochem was playing a game of Lego Racers. Stijn showed me a paper with lots of numbers written on it. "Look Mom," he said, "we're having a Lego Racers competition. Everyone gets to race three times, and I write down the time of each race. Then I'll add them all up, divide by three, and we'll know who the winner is!"
_Math is everywhere around us._
Our boys also love to play board games, especially Monopoly. We play it a lot, and because they enjoy it so much, they're eager to learn the math. For example, players can choose whether to pay a fixed amount of tax or ten per cent of their assets, so the boys wanted to learn all about percentages. This led to an interesting discussion about whether they would rather have a percentage of their dad's salary or the fixed allowance they were getting.
Jochem is the builder in our family. He spends hours creating the most beautiful Lego houses. Koen likes to help his big brother find the Lego bricks he needs in the big bin. "Koen, can you find me a thin white brick with twelve dots?" I heard Jochem ask. "I've got two with six dots for you," Koen answered, "that will work just fine too, won't it?"
Rutger didn't like math in school, but he loves playing the Zoombini computer games. I've played them myself and found them rather difficult. There's pattern finding, problem solving, sorting, graphing and mapping — all to guide the cute little Zoombini characters back to their home. Rutger doesn't find it difficult at all — it's just fun. "Am I really doing math now?" he asked, surprised. "Then I might like math after all."
## The Art of Playing
**J** ochem once said to me, "Mom, did you know that schoolchildren can't play?" When I asked him what he meant, he explained, "Well, whenever we have schoolchildren over, I notice that they run and shout a lot, but they never seem to really play. I think it's because they haven't had much time to themselves, and that's very sad." I was impressed by Jochem's observations, and I've noticed the same pattern myself.
_Play is a way of making sense of the world._
Young children have so many commitments and appointments nowadays. How can they find the time and peace of mind to play? No wonder so many of them run around and shout when they have a free moment! I can only agree with Jochem that it's very sad.
Jochem and Koen often play together until they go to bed, and when they wake up the next morning they just continue right where they left off. During the summer, they love to play outside making roads in the sand and driving toy cars over them, or making tiny houses from sticks and leaves. Their little toy characters have all sorts of adventures and "talk" in tiny little voices. In winter, they play the same way inside, with Lego bricks and wooden toys.
One day, I heard screaming upstairs and hurried to see what was wrong. I was sure the boys were fighting. When I entered the room and asked what the matter was, they both looked surprised, and started laughing. "No, Mom," they said, "we weren't fighting — these two Lego men are having an argument." I sat down and watched them help the Lego men settle their differences. At that moment I realized that playing is an art. Play is a way of making sense of the world, an opportunity to practice life's many challenges on a smaller scale.
## Taking Time
**R** utger once said, "I don't understand how children can feel bored, because there are always new books to read." I believe that boredom is a side-effect of the conventional school approach. In school, children are taught _not_ to do what they feel like, and _not_ to act on a sudden creative impulse or idea. Instead, they are expected to just sit and listen. Then, on weekends and school vacations, they can feel overwhelmed by the large amount of time suddenly available to spend on things they actually like. They might not even remember what most interested them.
Boredom is unknown to a baby or toddler. At that age, children are fascinated with the world around them; they are naturally curious about everything. Unschoolers retain this love of learning, and their natural curiosity will continue to flourish as they grow.
I think it's important to understand that all children may seem bored at times. But if we offer them an activity the minute they show the first sign of boredom, we don't give them the chance to discover what they really want to do. Processing time is just as necessary as active, productive time. If children can be trusted to take the time they need, they will eventually discover what interests them and gives them joy.
It often happens in our family that in moments of inactivity, the most creative and exciting new ideas are born. We have active periods in which we go on trips, work on projects and learn about many new things. We also have quieter periods when we stay mostly at home and think about all we've learned, taking time to process it. Both the active and inactive periods are an essential part of learning.
## Friendship
**M** any people understand that children can learn reading and writing at home, but they worry about socialization. They wonder if unschooling children interact well with others, if they have enough friends, or if they ever feel lonely.
There are many ways in which unschooling children can make friends. Local unschooling support groups offer regular get-togethers and outings. There are opportunities in most communities to meet others sharing similar interests, such as music, dance, sports, and theater. Volunteer work and apprenticeships are often available to match a specific interest. Many families find like-minded friends through attachment parenting and unschooling support groups. And if parents can't find the right group for their family, they can always start their own.
In school, there can be a lot of peer pressure, and because of that, schoolchildren find it important to fit in. Our children don't have this need to conform because they have a strong sense of who they are and what is important to them. They don't feel the need to be accepted by others; they have learned to accept themselves.
It's also important for children to have the chance to spend time alone if they want to. Our boys are very close and love to play and learn together, but they also enjoy playing by themselves. They are excited when other children visit, and they are comfortable alone. There's so much focus on socialization nowadays that we can forget how important it is to have time to oneself. As Thich Nhat Hanh wrote in _A Pebble for Your Pocket,_ "Each of you has a hermitage to go to inside — a place to take refuge and breathe. But this does not mean that you are cutting yourself off from the world. It means that you are getting more in touch with yourself."
_Unschooling children can be friends
with people of all ages._
Unschooling children are free to choose the amount of time they spend with others, as well as the way that time is spent. One of the greatest benefits of unschooling is that children aren't confined to a small group of classmates for friendship, and can be friends with people of all ages.
## Joy
**T** hese days children go to preschool earlier than ever. I remember people asking us if we had already enrolled Rutger for preschool when he was one year old! "Wouldn't it be great to have some time to yourself?" they would ask.
I'm not surprised that mothers might long for their children to attend school so they can have some peace and quiet. I often see parents busy correcting their child's behavior. That must be pretty tiring! If we can look at our children in a different way, and really enjoy their company, then parenting becomes a pleasure.
One day we had planned to go grocery shopping, but the sun was shining brightly and we decided to go to a park instead. We chose one that has a lake with a nice little sandy beach. We had a picnic lunch, and then we played in the sand, making sandcastles and baking sand "cakes." And when Rutger dropped a ball in the lake, we of course all had to take off our shoes and wade into the water! When we were finished playing, our clothes were a bit wet and we were all covered in sand, but we sure had a wonderful and relaxing afternoon.
_Unschooling gives us the opportunity to
truly enjoy life with our children._
We still needed groceries, so we stopped at the store on our way home. "Homeschoolers, I suppose?" the lady at the checkout asked. "I can always tell." I was surprised, but when I looked around I could see that we were the only ones with sandy shoes, wet clothes, red cheeks and happy expressions on our faces, on a regular weekday in October.
One of the best advantages of being a parent is that it gives us a great excuse to play! We can build with Lego, make a sandcastle, and have fun at the playground — all in the name of parenthood. If we've forgotten how to play, our children will be happy to remind us. Unschooling gives us the opportunity to truly enjoy life with our children.
## Learning from Life
**O** ne winter night, our cat died. While that was a sad event, in a way it was also beautiful. She had lived to be 23 — quite a long life for a cat. During the last few months of her life, it became obvious to us all that she wouldn't be with us much longer. Her death didn't come as a surprise, and the ending was very natural. During her last day, we saw her condition quickly worsen. At first I wanted to jump in the car and go to the vet to have it over with as soon as possible. She wasn't in pain though, and the boys wanted her to be at home, on her pillow in front of the fire, surrounded by all of us. They suggested that we all watch "The AristoCats" to honor her life.
It turned out to be a very special night, with our cat the center of attention. During the movie they all sat with her and said their goodbyes. When the movie was over, our cat took her last breath. She lay on her pillow, surrounded by candles and freshly picked flowers. Every now and then, one of the boys would look at her or touch her, and when we all felt ready, we buried her in the garden. That night, we read a book about the nine lives of a cat.
I remember when I skipped school one morning with a friend of mine because her pony was giving birth. I can still clearly remember how we watched from a corner of the stall while the veterinarian tried to save the pony after the foal was stillborn. I don't remember what lessons we had in school the rest of that day, but I do remember that I had many questions, and that none of them were answered.
Some years ago, one of our horses had a foal, and the boys were present at that magical occasion. At times like that, I'm so happy that our children unschool. We can take all the time we need to experience life fully and learn from these special moments.
Our third son, Jochem, was born while his oldest brother, Rutger, was still attending school. I remember how happy I was that Jochem was born during the Christmas holidays. If he had been born a few weeks earlier or later, Rutger would not have been able to fully enjoy that wondrous time with his newborn brother. Now it's one of his most precious memories.
## Dreams
**C** hildren learn much more from our actions than from our words. Instead of telling a child to say "please" and "thank you", it's simpler and more respectful to be polite to _them._ True kindness grows within a child only when they are treated with kindness. In the same way, if children see adults reading and learning, their natural curiosity will be nurtured. And by pursuing our dreams, we can inspire our children to follow theirs.
When my husband Hans was a child, he knew he wanted to be a saxophone player. But everyone told him that it wasn't possible to make a living that way, so he gave up on his dream, and chose a business career instead. At age 40, he realized that he feels happiest on stage playing music, and we decided to make his dream a reality.
_Unschooling allows children to discover their gifts._
Our U.S. visa was about to expire, so we sold our farm. We had to return to Europe, but we could choose where to settle, and we decided on that as a family. We took a cottage in Ireland so my husband could get out of the "fast lane" and do what he likes best.
Before leaving the U.S., we all made a list of the things that we value most in life, and we found those things in Ireland. We wanted to live surrounded by nature and have animals to care for. Our house in Ireland was bordered by a forest and a stream, and came with a resident pony and cats! It took a lot of courage, but we showed our children that change can be a good thing.
While living in Ireland, we traveled through Europe for several months, and learned many new things. The boys played cavemen after visiting prehistoric sites and became knights after visiting castles. They tasted new foods and heard new languages; made new friends and rekindled old friendships. They learned that they can be happy anywhere in the world, and that they don't have to be afraid of change. Later, when our boys felt the need to be closer to our extended family and to learn more about their roots, we had an opportunity to move back to the Netherlands (fortunately, regulations have eased somewhat, and families can now request an exemption from school based on holistic parenting beliefs).
I think we all know deep inside what makes us who we are — what our "mission" in life is. We all carry beautiful gifts to share with the world. Unschooling allows children to discover their gifts, and learn whatever they need to learn to have a fulfilling life.
## Trust
**I** t's not always easy to trust that our children will learn everything they need to know when they're ready. I sometimes wish I could go back in time and tell the younger me that everything will be all right! Just like every other unschooling parent, I've had my fears and doubts.
In those first years of unschooling, a phone call from one of my friends in Holland — whose children attended school — was enough to make me panic. When I heard about all of her children's school work, I sometimes worried that we were falling "behind schedule."
_Unschooling is based on trust._
Unschooling is based on trust. But whenever I started worrying, I found it hard to hold on to that trust. Out of fear, I would set everyone at the table to practice writing and math. It wasn't that I didn't trust _them,_ but that I didn't trust the process. What if I'd made the wrong decision? I was doing what my heart was telling me to do, but what if my heart was wrong? The real problem was that I didn't unschool when I was a child. I had never been trusted to learn naturally, to know intuitively what was right for me, so how could I have that kind of trust now?
I've had many moments like that one, especially when we first started unschooling. Each time, I had to overcome my fears and set my doubts aside to be able to trust freely again. While it wasn't always easy, it's been well worth the effort. Our trust in each other and in ourselves is now so strong that fear and worry hardly stand a chance.
It helped me to read about the experiences of other unschooling families and to discover like-minded friends through online and local support groups. But what helped most of all was the unconditional trust my children placed in me. "She'll be all right in a minute," I overheard Rutger tell Stijn at one of those moments. "Oh, yes, I know," Stijn answered, "she's just a bit scared right now." The moment he said that, my fear melted away. I realized that it didn't matter if my children learned on the same schedule as those of my friend. All that mattered was that we trusted them to learn on their own schedule. By meeting their needs and learning to trust, we have discovered to our delight that unschooling is simply living life, naturally and joyfully.
© 2001 Nanda Van Gestel
Mary's Memoirs by Mary Van Doren
A Mother's Helper
Though housework is still pretty overwhelming for me, de-junking helps, and almost by mistake I've come across a real gold mine: a mother's helper. This is something I had thought about in the past but had never done anything about.
By chance I met a very eager 11-year-old girl who lives nearby. Today was her first day here. In five hours, Amy vacuumed, washed dishes (three times), folded laundry (twice), made granola, helped put things away, and played with Helen (2½). I was able to do several loads of laundry and hang them out, sweep and wash the kitchen floor, put away laundry, and do some proofreading for _Growing Without Schooling._ We both stopped working as necessary to be with the children, but Greta (3 mos.) was mostly happy to watch, and Helen was thrilled to have a new friend around.
Amy knows we can't afford to pay her very much, but is very excited to be earning some spending money. With the good start we got today I probably won't need her too often, but she's always welcome to come and play with the children.
# Why I Chose Unschooling
by Kim Houssenloge
**I** t all started when I first had my sweet, precious little bundle of joy. Three weeks prior to his birth I was a teacher in a state primary school. I enjoyed my job and thought that I'd return after my baby's birth, at some point. Once Lewi entered the world, however, my thoughts drastically changed. I couldn't imagine handing my precious little bundle over to anyone else. Surely no one could love him as I did?
From very early on I thought about Lewi's education. As time went on I realized that I couldn't just hand him over to any old school at the age of five.
By the time he was three, I started seriously thinking about where on earth I could send him to school. I looked into all of the local state schools and realized that I no longer had the same view of the education system that I'd had only a few years before. At this time I was also doing some private tutoring (which I'd been doing for years as a teacher). I felt the need to stop as I didn't really feel I could reach the kids I was trying to help. They improved in terms of the system's demands upon them but they weren't developing the love of learning and passion for knowledge that my three-year-old boy had. What was wrong? I wasn't sure at the time. I now know.
I looked into Montessori and its approach to learning. I found elements of this that appealed to me, and pursued this option. I visited the school, I went to its open days, I met the teachers, I questioned them all. Something didn't feel right there for me.
My search for the best school for Lewi continued. I started looking into homeschooling as an option. It felt good to me in many ways, but it was a relatively new concept. As a teacher, I felt homeschooling to be a strange choice for parents to make. (I take that all back now!) Due to the negative feedback I got whenever I mentioned my thoughts about possibly homeschooling Lewi, and also the number of unanswered questions I had about the whole concept of not going to school, I put the idea aside and continued on my search for the right school.
By the time Lewi had turned four I was agonizing over whether or not to send him to preschool. In my heart it felt wrong, but all his friends were enrolling and he said he wanted to go. After many conversations with the staff at the local preschool, and due to my thoughts that he'd probably be going to school the following year anyway, I reluctantly sent him along. He loved it. He had lots of fun and developed a lovely bond with his teacher.
_There was nothing in need of change.
He was already living life to the fullest._
At about the same time I began looking into the Waldorf Steiner philosophy for learning. I went to open days, I talked to teachers, I spoke to parents of children already at the school. I spoke to friends who were going to send their children there. I surfed the net looking for information. Although there were elements of the approach that I really liked, it was still a system of learning. It was a school situation with lots of children, where everyone had to do similar things at similar times every day — six hours a day, five days a week.
During my research into the Steiner approach, I began looking again into homeschooling as an option. It was then that I came upon unschooling and natural learning. I became intrigued by the philosophy that children learn best when they are given the freedom to choose their own learning for themselves. I learned more about the nature of learning in a few short months than I ever did as a student in the school system and later on as a university student. Giving children the opportunities to self-direct their own learning and self-regulate their lives was a new concept to me. But looking at Lewi's life and realizing that he was learning all he needed to learn right then and there, regardless of a school system, felt empowering and wonderful. This is what I felt was missing in those years of my teaching career. No wonder students needed so many incentives and rewards to keep them going! They were learning, but they were learning what I wanted them to learn (or the Department of Education wanted them to learn). They weren't learning what was important to them. They were very rarely given the freedom of choice. They weren't able to dream and devise and hope and discuss their own paths. Their paths were chosen and that was that. The more I looked into natural learning, the more I loved it. This was what I'd been searching for. Lewi had been learning naturally all of his little life. It felt right.
Looking at the results of natural learning in Lewi's first four years of life, I could see a passionate little boy who had an all-consuming thirst for knowledge and learning. He was a lover of books. A lover of nature. A fanatic about anything he was interested in at the time. He was motivated and self-directed and loved to play. There was nothing, the unschooling approach explained, in need of change. Nothing needed implementing. Nothing needed to happen to Lewi at the age of five for him to suddenly switch on to learning — he was already there, doing it, living life to the fullest.
I had finally reached the place that I needed to get to make the best decision for Lewi's education. So, after five days of preschool, I pulled him out. Much to my family and friends' surprise and some disapproval, I had made the absolute best, heart decision I'd ever made. It felt right. It felt normal. It felt peaceful for Lewi to just stay at home and not enter a system to be institutionalized, to stay at home and keep doing what we'd always done.
That year I read and read and read. I grabbed at anything to do with unschooling and natural learning. I joined discussion groups left, right, and center. I printed out reams and reams of fantastic articles on all sorts of related topics. I bought lots of good books from great authors on this approach to learning. I found John Holt and John Taylor Gatto. I learned about how children learn. I learned about how they fail. I learned the most I've ever learned about learning and the education system in that single year. I felt armed and ready.
By the time Lewi turned five, I knew I'd have some explaining to do. The questions poured in at me from all angles: Why would you choose to homeschool? Won't he get bored? What about socialization? What about you, how will you get a break? How is he going to function normally? How will he make friends? How will he fit into society? Tell me you're not going to do this for the high school years? What about university — aren't you depriving him?
At the time I had some answers — now I think I have most of them. It was a daunting time. I felt a real lack of support. So I decided to make a concerted effort to find some like-minded people. I knew that both Lewi and I would need this type of support and social outlet in our lives. I phoned around searching for anyone in our local area who homeschooled. To my relief I found some. On making the initial contacts and attempting to get some get-togethers happening, however, it felt as though regular contact was not going to be possible. I started to feel despondent and concerned that we wouldn't have the support I'd really hoped for.
One day, this all changed for the better. When I arrived home from an outing, there was a message on my answering machine. It was a local family trying to make contact with as many homeschoolers as possible. They wanted to homeschool and asked if we would all like to get together and meet to discuss homeschooling. We all turned out to be natural learners! Who would've thought? This was the beginning of a wonderful, wonderful group. We now meet once a month and have great raves about learning and our children. We also meet with other homeschoolers once a month and have fun outings together.
Life for us is great. No hurried mornings trying to get to school on time, no "I don't want to go to school" comments, no "I don't want to do homework", no bullying, no tired and cranky child at the end of the day (well, not most days at least). Our days are spent enjoying life. Lewi is free to choose whatever it is he'd like to do. There are no schedules to follow. No deadlines to meet. No changing of topics when he's right in the middle of something fun or important to him. No pushing him to do something he's finding too hard or boring. No having to stop when a bell rings. No having to ask to go to the toilet. No waiting to eat even when you're starving. No lining up. No hands up to talk. No staying in late. No detentions for talking in class — actually, talking is encouraged! Lots of time to play and dream; laugh and run; swim and ride; read and listen; and talk, talk, talk. He gets to experience real life with real people. He's learning to interact with the world safely and confidently and with room to grow and change in a natural way.
_He loves his life and he loves learning._
He's learned to read and he's developing his writing. He loves most things to do with numbers. He's telling the time. He loves to draw, paint and make things. He loves to construct and build. He's passionate about the natural world and the sciences. He's enthused about the history relevant to his interests. He's confident on the computer and can surf the Internet. He loves riding his bike and swimming and exploring. He likes to kick a ball around and have a game of cricket. He loves playing with his friends and having fun. He loves to be outside and explore nature. He loves delving into his imagination and making up fantastical stories, characters and worlds. He's doing all this freely and in a self-directed way. His learning is his own.
And when it all boils down, he's just a normal child doing normal things. He loves his life and he loves learning. He's happy and content. He loves this way of life. What more could anyone want for him?
©2005 Kim Houssenloge
"It is nothing short of a miracle that the modern methods of instruction have not yet entirely strangled the holy curiosity of inquiry; for this delicate little plant, aside from stimulation, stands mainly in need of freedom; without this it goes to wrack and ruin without fail."
Albert Einstein
# Schooling: The Hidden Agenda
by Daniel Quinn
A talk given at the Houston Unschoolers Group
Family Learning Conference, October 6 & 7, 2000.
**I** suspect that not everyone in this audience knows who I am or why I've been invited to speak to you today. After all, I've never written a book or even an article about homeschooling or unschooling. I've been called a number of things: a futurist, a planetary philosopher, an anthropologist from Mars. Recently I was introduced to an audience as a cultural critic, and I think this probably says it best. As you'll see, in my talk to you today, I will be trying to place schooling and unschooling in the larger context of our cultural history and that of our species as well.
For those of you who are unfamiliar with my work, I should begin by explaining what I mean by "our culture." Rather than burden you with a definition, I'll give you a simple test that you can use wherever you go in the world. If the food in that part of the world is under lock and key, and the people who live there have to work to get it, then you're among people of our culture. If you happen to be in a jungle in the interior of Brazil or New Guinea, however, you'll find that the food is not under lock and key. It's simply out there for the taking, and anyone who wants some can just go and get it. The people who live in these areas, often called aboriginals, stone-age peoples, or tribal peoples, clearly belong to a culture radically different from our own.
I first began to focus my attention on the peculiarities of our own culture in the early 1960s, when I went to work for what was then a cutting-edge publisher of educational materials, Science Research Associates. I was in my mid-twenties and as thoroughly acculturated as any senator, bus driver, movie star, or medical doctor. My fundamental acceptances about the universe and humanity's place in it were rock-solid and thoroughly conventional. But it was a stressful time to be alive, in some ways even more stressful than the present. Many people nowadays realize that human life may well be in jeopardy, but this jeopardy exists in some vaguely defined future, twenty or fifty or a hundred years hence. But in those coldest days of the Cold War everyone lived with the realization that a nuclear holocaust could occur literally at any second, without warning. It was very realistically the touch of a button away.
Human life would not be entirely snuffed out in a holocaust of this kind. In a way, it would be even worse than that. In a matter of hours, we would be thrown back not just to the Stone Age but to a level of almost total helplessness. In the Stone Age, after all, people lived perfectly well without supermarkets, shopping malls, hardware stores, and all the elaborate systems that keep these places stocked with the things we need. Within hours our cities would disintegrate into chaos and anarchy, and the necessities of life would vanish from store shelves, never to be replaced. Within days famine would be widespread.
Skills that are taken for granted among Stone Age peoples would be unknown to the survivors — the ability to differentiate between edible and inedible foods growing in their own environment, the ability to stalk, kill, dress, and preserve game animals, and most important the ability to make tools from available materials. How many of you know how to cure a hide? How to make a rope from scratch? How to flake a stone tool? Much less how to smelt metal from raw ore. Commonplace skills of the Paleolithic, developed over thousands of years, would be lost arts.
All this was freely acknowledged by people who didn't doubt for a moment that we were living the way humans were meant to live from the beginning of time, who didn't doubt for a moment that the things our children were learning in school were exactly the things they _should be_ learning.
I'd been hired at SRA to work on a major new mathematics program that had been under development for several years in Cleveland. In my first year, we were going to publish the kindergarten and first-grade programs. In the second year, we'd publish the second-grade program, in the third year, the third-grade program, and so on. Working on the kindergarten and first-grade programs, I observed something that I thought was truly remarkable. In these grades, children spend most of their time learning things that no one growing up in our culture could possibly _avoid_ learning. For example, they learn the names of the primary colors. Wow, just imagine missing school on the day when they were learning _blue._ You'd spend the rest of your life wondering what color the sky is. They learn to tell time, to count, and to add and subtract, as if anyone could possibly fail to learn these things in this culture. And of course they make the beginnings of learning how to read. I'll go out on a limb here and suggest an experiment. Two classes of 30 kids, taught identically and given the identical text materials throughout their school experience, but one class is given no instruction in reading at all and the other is given the usual instruction. Call it the Quinn Conjecture: both classes will test the same on reading skills at the end of twelve years. I feel safe in making this conjecture because ultimately kids learn to read the same way they learn to speak, by hanging around people who read and by wanting to be able to do what these people do.
_Kids learn to read the same way
they learn to speak._
It occurred to me at this time to ask this question: Instead of spending two or three years teaching children things they will inevitably learn anyway, why not teach them some things they will _not_ inevitably learn and that they would actually _enjoy_ learning at this age? How to navigate by the stars, for example. How to tan a hide. How to distinguish edible foods from inedible foods. How to build a shelter from scratch. How to make tools from scratch. How to make a canoe. How to track animals — all the forgotten but still valuable skills that our civilization is actually built on. Of course I didn't have to vocalize this idea to anyone to know how it would be received. Being thoroughly acculturated, I could myself explain why it was totally inane. The way we live is the way humans were meant to live from the beginning of time, and our children were being prepared to enter that life. Those who came before us were savages, little more than brutes. Those who continue to live the way our ancestors lived are savages, little more than brutes. The world is well rid of them, and we're well rid of every vestige of them, including their ludicrously primitive skills.
_Hundreds of ideas were implemented —
and still the schools failed._
Our children were being prepared in school to step boldly into the only fully human life that had ever existed on this planet. The skills they were acquiring in school would bring them not only success but deep personal fulfillment on every level. What did it matter if they never did more than work in some mind-numbing factory job? They could parse a sentence! They could explain to you the difference between a Petrarchan sonnet and a Shakespearean sonnet! They could extract a square root! They could show you why the squares of the two sides of a right triangle add up to the square of the hypotenuse! They could analyze a poem! They could explain to you how a bill passes Congress! They could very possibly trace for you the economic causes of the Civil War. They had read Melville and Shakespeare, so why would they not now read Dostoevsky and Racine, Joyce and Beckett, Faulkner and O'Neill? But above all else, of course, the citizen's education — grades K to twelve — prepared children to be fully-functioning participants in this great civilization of ours. The day after their graduation exercises, they were ready to stride confidently toward any goal they might set themselves.
Of course, then, as now, everyone knew that the citizen's education was doing no such thing. It was perceived then — as now — that there was something strangely _wrong_ with the schools. They were failing — and failing miserably — at delivering on these enticing promises. Ah well, teachers weren't being paid enough, so what could you expect? We raised teachers' salaries — again and again and again — and still the schools failed. Well, what could you expect? The schools were physically decrepit, lightless, and uninspiring. We built new ones — tens of thousands, hundreds of thousands of them — and still the schools failed. Well, what could you expect? The curriculum was antiquated and irrelevant. We modernized the curriculum, did our damnedest to make it relevant — and still the schools failed. Every week — then as now — you could read about some bright new idea that would surely "fix" whatever was wrong with our schools: the open classroom, team teaching, back to basics, more homework, less homework, no homework — I couldn't begin to enumerate them all. Hundreds of these bright ideas were implemented — thousands of them were implemented — and still the schools failed.
Within our cultural matrix, every medium tells us that the schools exist to prepare children for a successful and fulfilling life in our civilization (and are therefore failing). This is beyond argument, beyond doubt, beyond question. In _Ishmael_ I said that the voice of Mother Culture speaks to us from every newspaper and magazine article, every movie, every sermon, every book, every parent, every teacher, every school administrator, and what she has to say about the schools is that they exist to prepare children for a successful and fulfilling life in our civilization (and are therefore failing). Once we step outside our cultural matrix, this voice no longer fills our ears and we're free to ask some new questions. Suppose the schools _aren't_ failing? Suppose they're doing exactly what we _really_ want them to do — but don't wish to examine and acknowledge?
Granted that the schools do a poor job of preparing children for a successful and fulfilling life in our civilization, but what things do they do excellently well? Well, to begin with, they do a superb job of keeping young people out of the job market. Instead of becoming wage-earners at age twelve or fourteen, they remain consumers only — and they consume billions of dollars worth of merchandise, using money that their parents earn. Just imagine what would happen to our economy if overnight the high schools closed their doors. Instead of having fifty million active consumers out there, we would suddenly have fifty million unemployed youth. It would be nothing short of an economic catastrophe.
Of course the situation was very different two hundred years ago, when we were still a primarily agrarian society. Youngsters were expected and needed to become workers at age ten, eleven, and twelve. For the masses, a fourth, fifth, or sixth-grade education was deemed perfectly adequate. But as the character of our society changed, fewer youngsters were needed for farm work, and the enactment of child-labor laws soon made it impossible to put ten-, eleven-, and twelve-year-olds to work in factories. It was necessary to keep them off the streets — and where better than in schools? Naturally, new material had to be inserted into the curriculum to fill up the time. It didn't much matter what it was. Have them memorize the capitals of every state. Have them memorize the principal products of every state. Have them learn the steps a bill takes in passing Congress. No one wondered or cared if these were things kids wanted to know or needed to know — or would _ever_ need to know. No one wondered or ever troubled to find out if the material being added to the curriculum was retained. The educators didn't _want_ to know, and, really, what difference would it make? It didn't matter that, once learned, they were immediately forgotten. It filled up some time. The law decreed that an eighth-grade education was essential for every citizen, and so curriculum writers provided material needed for an eighth-grade education.
During the Great Depression it became urgently important to keep young people off the job market for as long as possible, and so it came to be understood that a twelfth-grade education was essential for every citizen. As before, it didn't much matter what was added to fill up the time, so long as it was marginally plausible. Let's have them learn how to analyze a poem, even if they never read another one in their whole adult life. Let's have them read a great classic novel, even if they never read another one in their whole adult life. Let's have them study world history, even if it all just goes in one ear and out the other. Let's have them study Euclidean geometry, even if two years later they couldn't prove a single theorem to save their lives. All these things and many, many more were of course justified on the basis that they would contribute to the success and rich fulfillment that these children would experience as adults. Except, of course, that it didn't. But no one wanted to know about that. No one would have dreamed of testing young people five years after graduation to find out how much of it they'd retained. No one would have dreamed of asking them how useful it had been to them in realistic terms or how much it had contributed to their success and fulfillment as humans. What would be the point of asking _them_ to evaluate their education? What did _they_ know about it, after all? They were just high school graduates, not professional educators.
At the end of the Second World War, no one knew what the economic future was going to be like. With the disappearance of the war industries, would the country fall back into the pre-war depression slump? The word began to go out that the citizen's education should really include four years of college. _Everyone_ should go to college. As the economy continued to grow, however, this injunction began to be softened. Four years of college would sure be good for you, but it wasn't part of the citizen's education, which ultimately remained a twelfth-grade education.
_The curriculum had achieved the status of scripture._
It was in the good years following the war, when there were often more jobs than workers to fill them, that our schools began to be perceived as failing. With ready workers in demand, it was apparent that kids were coming out of school without knowing much more than the sixth-grade graduates of a century ago. They'd "gone through" all the material that had been added to fill up the time — analyzed poetry, diagramed sentences, proved theorems, solved for x, plowed through thousands of pages of history and literature, written bushels of themes, but for the most part they retained almost none of it — and of how much use would it be to them if they had? From a business point of view, these high-school graduates were barely employable.
But of course by then the curriculum had achieved the status of scripture, and it was too late to acknowledge that the program had never been designed to be _useful._ The educators' response to the business community was, "We just have to give the kids more of the same — more poems to analyze, more sentences to diagram, more theorems to prove, more equations to solve, more pages of history and literature to read, more themes to write, and so on." No one was about to acknowledge that the program had been set up to keep young people off the job market — and that it had done a damn fine job of _that_ at least.
_Children are the most fantastic learners in the world._
But keeping young people off the job market is only half of what the schools do superbly well. By the age of thirteen or fourteen, children in aboriginal societies — tribal societies — have completed what we, from our point of view, would call their "education." They're ready to "graduate" and become adults. In these societies, what this means is that their survival value is 100%. All their elders could disappear overnight, and there wouldn't be chaos, anarchy, and famine among these new adults. They would be able to carry on without a hitch. None of the skills and technologies practiced by their parents would be lost. If they wanted to, they could live quite independently of the tribal structure in which they were reared.
But the last thing we want our children to be able to do is to live independently of our society. We don't want our graduates to have a survival value of 100%, because this would make them free to opt out of our carefully constructed economic system and do whatever they please. We don't want them to do whatever they please, we want them to have exactly two choices (assuming they're not independently wealthy). Get a job or go to college. Either choice is good for us, because we need a constant supply of entry-level workers and we also need doctors, lawyers, physicists, mathematicians, psychologists, geologists, biologists, school teachers, and so on. The citizen's education accomplishes this almost without fail. Ninety-nine point nine percent of our high school graduates make one of these two choices.
And it should be noted that our high-school graduates are reliably _entry-level_ workers. We want them to _have_ to grab the lowest rung on the ladder. What sense would it make to give them skills that would make it possible for them to grab the second rung or the third rung? Those are the rungs their older brothers and sisters are reaching for. And if this year's graduates were reaching for the second or third rungs, who would be doing the work at the bottom? The business people who do the hiring constantly complain that graduates know absolutely nothing, have virtually no useful skills at all. But in truth how could it be otherwise?
So you see that our schools are not failing, they're just succeeding in ways we prefer not to see. Turning out graduates with no skills, with no survival value, and with no choice but to work or starve are not _flaws_ of the system, they are _features_ of the system. These are the things the system _must do_ to keep things going on as they are.
_Our schools are not failing, they're just
succeeding in ways we prefer not to see._
The need for schooling is bolstered by two well-entrenched pieces of cultural mythology. The first and most pernicious of these is that children _will not learn_ unless they're compelled to — in school. It is part of the mythology of childhood itself that children _hate_ learning and will avoid it at all costs. Of course, anyone who has had a child knows what an absurd lie this is. From infancy onward, children are the most fantastic learners in the world. If they grow up in a family in which four languages are spoken, they will be speaking four languages by the time they're three or four years old — without a day of schooling, just by hanging around the members of their family, because they desperately want to be able to do the things they do. Anyone who has had a child knows that they are tirelessly curious. As soon as they're _able_ to ask questions, they ask questions incessantly, often driving their parents to distraction. Their curiosity extends to everything they can reach, which is why every parent soon learns to put anything breakable, anything dangerous, anything untouchable up high — and if possible behind lock and key. We all know the truth of the joke about those childproof bottle caps: those are the kind that only children can open.
_The desire to learn is hardwired into the human child._
People who imagine that children are resistant to learning have a nonexistent understanding of how human culture developed in the first place. Culture is no more and no less than the totality of _learned_ behavior and information that is passed from one generation to the next. The desire to eat is not transmitted by culture, but knowledge about how edible foods are found, collected, and processed _is_ transmitted by culture. Before the invention of writing, whatever was not passed on from one generation to the next was simply lost, no matter what it was — a technique, a song, a detail of history. Among aboriginal peoples — those we haven't destroyed — the transmission between generations is remarkably complete, but of course not 100% complete. There will always be trivial details of personal history that the older generation takes to its grave. But the vital material is never lost.
This comes about because the desire to learn is _hardwired_ into the human child just the way that the desire to reproduce is hardwired into the human adult. It's genetic. If there was ever a strain of humans whose children were _not_ driven to learn, they're long gone, because they _could not be_ culture-bearers.
Children don't have to be _motivated_ to learn everything they can about the world they inhabit, they're absolutely _driven_ to learn it. By the onset of puberty, children in aboriginal societies have unfailingly learned everything they need to function as adults.
Think of it this way. In the most general terms, the human biological clock is set for two alarms. When the first alarm goes off, at birth, the clock chimes _learn, learn, learn, learn, learn._ When the second alarm goes off, at the onset of puberty, the clock chimes _mate, mate, mate, mate, mate._ The chime that goes _learn, learn, learn_ never disappears entirely, but it becomes relatively faint at the onset of puberty. At that point, children cease to want to follow their parents around in the learning dance. Instead, they want to follow _each other_ around in the mating dance.
We, of course, in our greater wisdom have decreed that the biological clock regulated by our genes must be ignored.
_They're convinced that children don't
want to learn anything at all — and they
point to school children to prove it._
What sells most people on the idea of school is the fact that the unschooled child learns what it _wants_ to learn _when it_ wants to learn it. This is intolerable to them, because they're convinced that children don't want to learn anything at all — and they point to school children to prove it. What they fail to recognize is that the learning curve of preschool children swoops upward like a mountain — but quickly levels off when they enter school. By the third or fourth grade it's completely flat for most kids. Learning, such as it is, has become a boring, painful experience they'd love to be able to avoid if they could. But there's another reason why people abhor the idea of children learning what they want to learn when they want to learn it. _They won't all learn the same things!_ Some of them will never learn to analyze a poem! Some of them will never learn to parse a sentence or write a theme! Some of them will never read _Julius Caesar!_ Some will never learn geometry! Some will never dissect a frog! Some will never learn how a bill passes Congress! Well, of course, this is too horrible to imagine. It doesn't matter that 90% of these students will never read another poem or another play by Shakespeare in their lives. It doesn't matter that 90% of them will never have occasion to parse another sentence or write another theme in their lives. It doesn't matter that 90% retain no functional knowledge of the geometry or algebra they studied. It doesn't matter that 90% never have any use for whatever knowledge they were supposed to gain from dissecting a frog. It doesn't matter that 90% graduate without having the vaguest idea how a bill passes Congress. All that matters is that they've _gone through it!_
_People remember the things they need to know._
The people who are horrified by the idea of children learning what they want to learn when they want to learn it have not accepted the very elementary psychological fact that people (all people, of every age) remember the things that are important to them — the things they _need to know_ — and forget the rest. I am a living witness to this fact. I went to one of the best prep schools in the country and graduated fourth in my class, and I doubt very much if I could now get a passing grade in more than two or three of the dozens of courses I took. I studied classical Greek for two solid years, and now would be unable to read aloud a single sentence.
One final argument people advance to support the idea that children _need_ all the schooling we give them is that there is _vastly more material_ to be learned today than there was in prehistoric times or even a century ago. Well, there is of course vastly more material that _can_ be learned, but we all know perfectly well that it isn't being taught in grades K to twelve. Whole vast new fields of knowledge exist today — things no one even heard of a century ago: astrophysics, biochemistry, paleobiology, aeronautics, particle physics, ethology, cytopathology, neurophysiology — I could list them for hours. But are these the things that we have jammed into the K-12 curriculum because everyone needs to know them? Certainly not. The idea is absurd. The idea that children need to be schooled for a long time because there is so much that _can be_ learned is absurd. If the citizen's education were to be extended to include everything that _can be_ learned, it wouldn't run to grade twelve, it would run to grade twelve thousand, and no one would be able to graduate in a single lifetime.
I know of course that there is no one in this audience who needs to be sold on the virtues of homeschooling or unschooling. I hope, however, that I may have been able to add some philosophical, historical, anthropological, and biological foundation for your conviction that school ain't all it's cracked up to be.
©2000 Daniel Quinn
"If I had to make a general rule for living and working with children, it might be this: be wary of saying or doing anything to a child that you would not do to another adult, whose good opinion and affection you valued."
John Holt
# How Do We Know They're Learning?
by Jan Hunt
**I**n Unschooling, the child's current interests are followed, and the parents act not as teachers but as tutors and resource assistants. This approach is often misunderstood, because it is based on assumptions that are quite different from those implicit in conventional schooling.
Unschoolers are often described by what we do _not_ do; we do not "teach"; we do not impose an arbitrary, artificial curriculum; we do not structure the hours of our "school day." But there are so many things we do:
* Answer questions. Many of us believe that this is the most essential aspect of Unschooling.
* Encourage creative and cooperative solutions to problems as they arise.
* Find resources and information to support whatever interests the child is currently exploring.
* Attempt to illustrate, through the daily decisions we make, the benefits of such personal moral qualities as friendship, honesty, and responsibility.
* Model the joy of learning through our own discussions, reading, and research.
While it is not impossible for a conventionally schooling family to pursue the kinds of activities I have described, it is simply more difficult to do so when parents and children have so much less time together, and when even after-school hours are taken up by projects, homework, and other school-related demands. School children can also become used to seeking emotional support from peers rather than parents, and this pattern can be difficult to interrupt even when school is not in session.
The assumption that Unschooling parents somehow lack awareness of their children's progress, and therefore require formal evaluation of that progress, is related to the fact that unschoolers function outside the arena of the schools, and our philosophies and methods are not always well-understood.
How do Unschooling parents know their children are learning? The answer to this question is, to put it most simply, direct observation. I have only one child. If a teacher had only one child in her classroom, and was unable to describe the reading skills of that child, everyone would be dismayed — how could a teacher have such close daily contact with one child and miss something so obvious? Yet many people unfamiliar with Unschooling imagine that parents with just this sort of close daily contact with their child require outside evaluation to determine that child's progress. This puzzles Unschooling parents, who cannot imagine missing anything so interesting as the nature of their child's learning.
No Unschooling parents have twenty-five children, and we are thus free to focus on the enhancement of learning without being continually distracted by the many time-consuming tasks, unrelated to learning, that are necessary in a classroom situation. This freedom from distraction is a major factor in the establishment of a lively, creative, and joyful learning environment.
Any parent of a toddler could almost certainly tell us how many numbers her child can count to, and how many colors he knows — not through testing, but simply through many hours of listening to his questions and statements. In Unschooling, this type of observation simply continues on into higher ages and more complex learning.
There are many times in the course of a day when a reasonably curious child will want to know the meaning of certain printed words — in books and newspapers, on the computer or television, on board game instruction cards, on package labels, on mail that has just arrived, and so on. If this child's self-esteem is intact, he will not hesitate to ask his parents the meaning of these words. Through the decrease of questions of this type, and the actual reading aloud of certain words ("Look, Daddy, this package is for you!"), it seems safe to assume that reading is progressing in the direction of literacy. This may seem to outsiders to be somewhat imprecise, but unschooling parents learn through experience that more specific evaluation is intrusive, unnecessary, and self-defeating.
_Specific evaluation is intrusive,
unnecessary, and self-defeating._
If the government were to establish compulsory evaluation of babies to determine whether they were walking on schedule, everyone would think that was absurd. We all know that healthy babies walk eventually, and that it would be futile and frustrating to attempt to speed up that process — as foolish as trying to speed up the blooming of a rose. Gardeners do not worry about late-blooming roses, or measure their daily progress — they trust in nature's good intentions, meet the needs of the plants under their care, and know that any further intervention would interfere with the natural flow of their growth. Such trust is as essential in the education of a child as it is in gardening. All healthy rose bushes bloom when ready, all healthy babies walk when ready, and all healthy children in a family of readers read when ready — though this may be as late as ten or twelve. There is no need to speed up or measure this process. When a child is free to learn at his own pace, he will continue to love learning throughout his life.
The child's progress is not always smooth; there may be sudden shifts from one stage to the next. Formal evaluation given just prior to such a shift may give unfair and misleading information. At a time when I knew (through a reduction in the number of requests for me to read certain signs, labels, etc.) that my son Jason's reading was improving, though as far as I knew he was not yet reading fluently, I told him one evening that I was unable to read to him because I wasn't feeling well. He said, "Well, you can rest and I'll read a book to you." He proceeded to read an entire book flawlessly, at a higher level of difficulty than I would have guessed.
_A schedule of intellectual growth
exists within each child._
Thus it sometimes happens in the natural course of living with a child that we receive direct and specific information about his progress. But it should be stressed that this is part of the natural process of supporting a child's learning, and that requiring such direct proof is almost always self-defeating. Had I required him to read the book, he might well have refused, because he would have felt the anxiety which anyone feels when being evaluated. But because he chose to read voluntarily, and his accuracy was not being examined, he had no reason to feel anxious.
Unschooling parents, then, cannot avoid having a good general idea of a child's progress in reading, or in any other area. Without testing for specific learning, we may underestimate a child's abilities to some extent, but all that means is that we make delightful discoveries along the way.
If Unschooling parents do not measure, evaluate and control learning, how can the child himself know when to move on to the next level? If we were to ask a horticulturist how a rose knows when to bloom, he or she could not answer that question; it is simply assumed that such knowledge is built into the wondrous genetics of the seed. A child's schedule of intellectual growth, like the rose's blooming, may indeed be a mysterious process, but it nonetheless exists within each child. Jason, one day at age three, though not yet a fluent reader, taught himself squares and square roots. How could I have guessed that he was ready for that level of mathematics on that particular day? Had I been imposing a standard curriculum, I might have discouraged early math and emphasized reading, and to what end? He is now proficient in, and greatly enjoys, both areas. Ultimately, it made no difference when he achieved this mastery. As John Holt once observed, children are not trains. If a train does not reach every station on time, it will be late reaching its ultimate destination. But a child can be late at any "station", and can even change the entire route of the learning process, and still reach every area of learning.
The Unschooling child not only knows what he needs to learn, but how best to go about learning it. Jason has always devised ingenious ways for learning what is currently in the foreground of his interest. His method for learning squares and square roots — rows and columns of dots on paper — would never have occurred to me, even if I had guessed correctly that he was ready for this subject at that early age. At age 6, he was looking over a new globe, and made a game of guessing which of several pairs of countries was larger in area, then larger in population, and so on. These sorts of games went on constantly; his creativity in designing interesting learning methods far surpassed my own, and I never had to give a single thought to motivation. My child is not unique; many Unschooling parents have reported just this sort of creativity and joyful learning in their children.
_My role has not been that
of teacher, but of facilitator._
Jason has had no lessons in the conventional sense. He has taught himself, with help as needed and requested by him, reading, writing, math, art and science. However, these subjects are not treated as separate categories, but as parts of the topic of current interest. My role has not been that of teacher, but of facilitator. I am not merely a passive observer, however. When he asked questions, which he did many times each day, I answered as well as I could. If I couldn't, I became a researcher: I made phone calls, helped him to use the encyclopedia, went with him to the library, or found someone with relevant experience with whom he could learn; whatever helped him to find the answer (today's parents, of course, have the Internet as another resource). This was not merely helpful in answering his specific question, but in the more general sense of modeling the many ways in which information can be obtained.
_If a child learns how to obtain
information, he can apply that skill
throughout his life._
While I did not choose Unschooling for religious reasons, I have always welcomed the time available to explore questions of personal ethics, and to encourage such qualities as kindness, honesty, trust, cooperation, creative solutions to problems, and compassion for others. We have also appreciated having time in the morning to discuss such things as dreams from the previous night and plans for the day ahead, when I would otherwise have been preoccupied with helping him to get ready for school. Believing that modern life is already overly hectic, we try as much as possible to make room for unhurried time in our family.
In an age of "information explosion", it is no longer meaningful or realistic to require rote memorization of specific facts. Not only are these facts meaningless to the child unless they happen to coincide with his own current and unique interests, many of these facts will in any case be outdated by the time he is an adult. But if a child learns how to obtain information, he can apply that skill throughout his life. Regardless of which specific topics were covered, our primary focus has always been "how to learn" and "how to obtain information." As John Holt wrote, "Since we can't know what knowledge will be most needed in the future, it is senseless to try to teach it in advance. Instead, we should try to turn out people who love learning so much and learn so well that they will be able to learn whatever needs to be learned."
©2001 Jan Hunt
Mary's Memoirs by Mary Van Doren
Asking About Numbers
Helen (6) asks a great many multiplication questions: "How much is 6 sixes?" for example. She also asks about addition and subtraction, and recently she realized that the more closely she herself is involved, the more easily she understands things — she figures out more easily how old she will be when her baby sister Alice is five than how old her other sister, Greta, will be, for example.
She knows a lot of numbers. For example, she seems really to _know_ 6: that 2 + 4 = 6, 2 × 3 = 6, 6 - 1 = 5, etc. — all of those "number facts" that relate to 6. She has also shown a great interest in fractions. And for a while her favorite number was a hundred million: "How far is 100,000,000 inches?" "How long would it take to count to 100,000,000?" "Are there 100,000,000 people?" We get out the calculator and tell her how far 100,000,000 inches is (halfway from here to California), and whatever else we can tell her about what she wants to know.
Helen has also estimated amounts, 8 × 13, for example. She asked, "How much is 8 thirteens? About a hundred, I think." She has also estimated length, with ribbon, quite accurately. We never have lessons in arithmetic — or anything else — we answer questions.
# What is Unschooling?
by Earl Stevens
"What we want to see is the child in pursuit of knowledge, not knowledge in pursuit of the child."
–George Bernard Shaw
**I**t is very satisfying for parents to see their children in pursuit of knowledge. It is natural and healthy for the children, and in the first few years of life, the pursuit goes on during every waking hour. But after a few short years, most kids go to school. The schools also want to see children in pursuit of knowledge, but the schools want them to pursue mainly the _school's_ knowledge and devote twelve years of life to doing so.
In his acceptance speech for the New York City Teacher of the Year award (1990), John Gatto said, "Schools were designed by Horace Mann... and others to be instruments of the scientific management of a mass population." In the interests of managing each generation of children, the public school curriculum has become a hopelessly flawed attempt to define education and to find a way of delivering that definition to vast numbers of children.
The traditional curriculum is based on the assumption that children must be pursued by knowledge because they will never pursue it themselves. It was no doubt noticed that, when given a choice, most children prefer not to do school work. Since, in a school, knowledge is _defined as schoolwork,_ it is easy for educators to conclude that children don't like to acquire knowledge. Thus schooling came to be a method of controlling children and forcing them to do whatever educators decided was beneficial for them. Most children don't like textbooks, workbooks, quizzes, rote memorization, subject schedules, and lengthy periods of physical inactivity. One can discover this — even with polite and cooperative children — by asking them if they would like to add more time to their daily schedule. I feel certain that most will decline the offer.
The work of a schoolteacher is not the same as that of a home-schooling parent. In most schools, a teacher is hired to deliver a ready-made, standardized, year-long curriculum to 25 or more age-segregated children who are confined in a building all day. The teacher must use a standard curriculum — not because it is the best approach for encouraging an individual child to learn the things that need to be known — but because it is a convenient way to handle and track large numbers of children. The school curriculum is understandable only in the context of bringing administrative order out of daily chaos, of giving direction to frustrated children and unpredictable teachers. It is a system that staggers ever onward but never upward, and every morning we read about the results in our newspapers.
But despite the differences between the school environment and the home, many parents begin homeschooling under the impression that it can be pursued only by following some variation of the traditional public school curriculum in the home. Preoccupied with the idea of "equivalent education", state and local education officials assume that we must share their educational goals and that we home-school simply because we don't want our children to be inside their buildings. Textbook and curriculum publishing companies go to great lengths to assure us that we must buy their products if we expect our children to be properly educated. As if this were not enough, there are national, state, and local support organizations that have practically adopted the use of the traditional curriculum and the school-in-the-home image of homeschooling as a de facto membership requirement. In the midst of all this, it can be difficult for a new homeschooling family to think that an alternative approach is possible.
One alternative approach is unschooling, also known as natural learning, experience-based learning, or independent learning. When our local homeschooling support group announced a gathering to discuss Unschooling, we thought a dozen or so people might attend, but more than 100 adults and children showed up. For three hours, parents and some of the children took turns talking about their homeschooling experiences and about unschooling. Many people said afterward that they left the meeting feeling reinforced and exhilarated — not because anybody told them what to do or gave them a magic formula — but because they grew more secure in making these decisions for themselves. Sharing ideas about this topic left them feeling empowered.
_Unschooling isn't a method, it is a way
of looking at children and at life._
Before I talk about what I think unschooling is, I must talk about what it isn't. Unschooling isn't a recipe, and therefore it can't be explained in recipe terms. It is impossible to give unschooling directions for people to follow so that it can be tried for a week or so to see if it works. Unschooling isn't a method, it is a way of looking at children and at life. It is based on trust that parents and children will find the paths that work best for them — without depending on educational institutions, publishing companies, or experts to tell them what to do.
Unschooling does not mean that parents can never teach anything to their children, or that children should learn about life entirely on their own without the help and guidance of their parents. Unschooling does not mean that parents give up active participation in the education and development of their children and simply hope that something good will happen. Finally, since many unschooling families have definite plans for college, unschooling does not even mean that children will never take a course in any kind of a school.
Then what is unschooling? I can't speak for every person who uses the term, but I can talk about my own experiences. Our son has never had an academic lesson, has never been told to read or to learn mathematics, science, or history. Nobody has told him about phonics. He has never taken a test or been asked to study or memorize anything. When people ask, "What do you do?" my answer is that we follow our interests — and our interests inevitably lead to science, literature, history, mathematics, music — all the things that have interested people before anybody thought of them as "subjects."
_Unschooling children do real things all day long._
A large component of unschooling is grounded in doing real things, not because we hope they will be good for us, but because they are intrinsically fascinating. There is an energy that comes from this that you can't buy with a curriculum. Children do real things all day long, and in a trusting and supportive home environment, "doing real things" invariably brings about healthy mental development and valuable knowledge. It is natural for children to read, write, play with numbers, learn about society, find out about the past, think, wonder and do all those things that society so unsuccessfully attempts to force upon them in the context of schooling.
While few of us get out of bed in the morning in the mood for a "learning experience", I hope that all of us get up feeling in the mood for life. Children always do so — unless they are ill or life has been made overly stressful or confusing for them. Sometimes the problem for the parent is that it can be difficult to determine if anything important is actually going on. It is a little like watching a garden grow. No matter how closely we examine the garden, it is difficult to verify that anything is happening at that particular moment. But as the season progresses, we can see that much has happened, quietly and naturally. Children pursue life, and in doing so, pursue knowledge. They need adults to trust in the inevitability of this very natural process, and to offer what assistance they can.

Parents come to our Unschooling discussions with many questions about fulfilling state requirements. They ask: "How do unschoolers explain themselves to the state when they fill out the paperwork every year?", "If you don't use a curriculum, what do you say?" and "What about required record-keeping?" To my knowledge, unschoolers have had no problems with our state department of education over matters of this kind. This is a time when even many public school educators are moving away from the traditional curriculum, and are seeking alternatives to fragmented learning and drudgery.
_Children pursue life, and in doing
so, pursue knowledge._
When I fill out the paperwork required for homeschooling in our state, I briefly describe, in the space provided, what we are currently doing, and the general intent of what we plan to do for the coming year. I don't include long lists of books or describe any of the step-by-step skills associated with a curriculum. For example, under English and Language Arts, I mentioned that our son's favorite "subject" is the English language. I said a few words about our family library. I mentioned that our son reads a great deal and uses our computer for whatever writing he happens to do. I concluded that, "Since he already does so well on his own, we have decided not to introduce language skills as a subject to be studied. It seems to make more sense for us to leave him to his own continuing success."
Unschooling is a unique opportunity for each family to do whatever makes sense for the growth and development of their children. If we have a reason for using a curriculum and traditional school materials, we are free to use them. They are not a universally necessary or required component of unschooling, either educationally or legally. Allowing curriculums, textbooks, and tests to be the defining, driving force behind the education of a child is a hindrance in the home as much as in the school — not only because it interferes with learning, but because it interferes with trust. As I have mentioned, even educators are beginning to question the pre-planned, year-long curriculum as an out-dated, 19th century educational system. There is no reason that families should be less flexible and innovative than schools.
Anne Sullivan, Helen Keller's mentor and friend, said:
"I am beginning to suspect all elaborate and special systems of education. They seem to me to be built up on the supposition that every child is a kind of idiot who must be taught to think. Whereas, if the child is left to himself, he will think more and better, if less showily. Let him go and come freely, let him touch real things and combine his impressions for himself, instead of sitting indoors at a little round table, while a sweet-voiced teacher suggests that he build a stone wall with his wooden blocks, or make a rainbow out of strips of coloured paper, or plant straw trees in bead flower-pots. Such teaching fills the mind with artificial associations that must be got rid of, before the child can develop independent ideas out of actual experiences." (Helen Keller, _The Story of My Life,_ 1902)
Unschooling provides a unique opportunity to step away from systems and methods, and to develop independent ideas out of actual experiences, where the child is truly in pursuit of knowledge, not the other way around.
©1994 Earl Stevens
"Play is the highest form of research."
Albert Einstein
# Learning Through Play
by Jan Hunt
My son Jason, now a young adult, has been unschooled from the beginning. We were fortunate to have discovered John Holt's books when Jason was two, and never looked back.
Jason was a very inquisitive child, who loved learning new words and playing with numbers. He had an extensive vocabulary by 18 months, understood the concept of infinity at 2, and taught himself squares and square roots at 3. In spite of all this, I still wondered if I should use a curriculum, especially for math. It was hard not to worry when taking a path that was so different from the one I had taken in childhood. It was also hard not to be affected by my parents' doubts, even though I understood the reasons for their skepticism.
When Jason was 7, he asked for a math book as his special holiday gift that year, after we read John Holt's glowing review of Harold Jacobs' book _Mathematics: A Human Endeavor,_ in _Growing Without Schooling._ The book proved to be as wonderful as Holt had said, and we enjoyed it a lot. But a few months later, I noticed that Jason hadn't looked at it for a while. I decided to suggest reading a chapter per week together. Fortunately, I was busy that day and didn't get around to asking him. That evening, Jason came up to me, book in hand, saying "Let's play math." My first thought was, "Whew, that was a close one." Had I made my offer, he probably would have accepted it, and even learned from it, but where would the concept of _math as play_ have gone?
When Jason was 8, my neighbor, who also had an 8-year-old son, asked me if Jason knew the times tables, and when I said he did, she asked me how he had learned it. Her son had struggled for months, and still had trouble remembering the answers. He was frustrated and worried about his grades, but none of her ideas had helped. I explained that Jason learned everything in a very natural way, as needed. For example, his dad had brought home a dart board, just for fun, a few months back. Scoring a darts game involves both addition and multiplication, and because Jason wanted to be the scorekeeper, he learned all the number combinations used for darts (and later learned other combinations as he needed them), though the dartboard had not been purchased with that in mind, nor had we ever used the term "times tables".
_Unschooling isn't a technique; it's
living and learning naturally, lovingly,
and respectfully together._
Now, Jason can do math in his head, unlike me. Having memorized formulas, I can solve most math problems, but always on paper, and I rarely understand the concepts involved. Jason can not only do the math easily but really understands the whole process. If he happens to need a new mathematical tool, he can easily learn it. He needed to know about sines and cosines when he converted paintings into graphics for my children's book _A Gift for Baby._ He learned this quickly and easily from the Internet. I could only look back and remember how much time I had spent memorizing calculus formulas, and though I passed all the tests, I really hadn't learned anything. I didn't understand how the formulas actually worked, or how to use them in the real world.
Jason has learned much of what he knows through play, and has the same love of learning he was born with. He learned about money by playing Monopoly, about spelling by playing Scrabble, about strategies by playing chess, Clue, and video games, about our culture by watching classic and modern TV shows and films, about politics and government by watching "Yes, Minister", about grammar by playing Mad Libs, about fractions by cooking, about words by playing Dictionary, and about writing skills by reading P. G. Wodehouse. He learns about life through living it. But all of this learning has taken place more incidentally than intentionally, as part of the larger business of living life freely and naturally.
During a recent newspaper interview for an article on unschooling, the reporter asked me which techniques unschoolers use that could be used by parents of children in school. I explained that unschooling isn't a technique; it's living and learning naturally, lovingly, and respectfully together. As my friend and unschooling parent Mary Van Doren once wrote:
"Raising children with an emphasis on intrinsic rewards is not a technique, a method or a trick to get them to do what the parent wants them to by subtler means, but a way of life, a way of living with children with real respect for their intelligence and for their being."
I feel indebted to John Holt and other unschooling writers for encouraging me to trust Jason to know what he needed and wanted to learn and how to go about learning it. But my best teacher has always been my son. For parents who went to school, unschooling can be a challenge, but it is also our best opportunity to learn to trust our children's natural love of learning.
©2007 Jan Hunt
Mary's Memoirs by Mary Van Doren
Telling Stories
We did some camping this past summer, and decided not to bring books. Between nightfall and sleep, Helen (3) wanted to be read to — not possible. So we thought about telling stories, but no stories came to mind. How could that be? We quickly found a jumping-off point: we were camping, so Mark told us about camping as a Boy Scout.
Now we know we can always have relevant stories at any time. Helen loves to hear about Mama and Papa when they were children. She also likes to hear about herself when she was a baby and even what we did yesterday. I think as we go along we may start making up some stories too. We have enjoyed this a lot, though we certainly lacked confidence at first — the written word is so powerful in our lives.
# What About College?
by Rue Kream
"School was the unhappiest time of my life and the worst trick it ever played on me was to pretend that it was the world in miniature. For it hindered me from discovering how lovely and delightful and kind the world can be, and how much of it is intelligible."
– E. M. Forster
_Don't you worry that your kids will be unprepared when the time comes for them to leave the nest? What about college? I don't see how my kids will be prepared for the real world if they don't go to school, and I can't imagine dealing with a teenager at home all the time._
Our goal is that there will not be a particular moment when our children must suddenly be pushed from the nest. Our hope is that, by allowing our children to seek out and take responsibility in their own time with our guidance and support, but not pressure, they will experience a smoother transition into adulthood and will think of us as a safety net rather than an obstacle.
The problems of a typical teen/parent relationship are created in large part by the dynamics inherent in a system that separates parents and children from the time the child turns five years old (or often earlier). It is very difficult to retain a connected relationship when the amount of time spent together is so minimal and is so often spent in preparation for the next day's tasks. Add to that the fact that a mainstream teenager has little to no control over her own life or time, has little opportunity to pursue what interests her, is dealing with the flood of emotions that hormones bring, is put on a schedule that does not permit her the sleep her body needs, and lives in a society that encourages her to pull away from her parents at a certain age whether she wants to or not, and you have a recipe for unhappiness.
Our society has choreographed a "typical" progression from child to adult, and expects all teenagers to travel the same path. A person who doesn't feel comfortable on that path is a rebel or a delinquent. A child who is not ready to move on as quickly as another child might be is perceived as immature or spoiled or "slow". A child who is ready to move on more quickly than others has no opportunity to do so. Unschooling gives each child the time and the room to follow her own path and to travel that path with the loving support and companionship of her family.
The groundwork we lay with our kids when they are young is vitally important to our future relationships. The relationships I have with Dagny and Rowan are open, honest, and respectful. We have our difficult moments, just as anyone in a relationship does, but overall it's a pleasure to spend time with them. I have no reason to believe that will change at any particular "teen" age. We will ride the swells of hormones and growing pains together, and each of them will leave the nest in her own time and way.
I do not worry that they will be unprepared, because I trust that they will know when the time is right. They will have spent a lifetime making their own decisions about what they are capable of. Just as they knew when they were ready to tie their shoes, take off their training wheels, or watch a scary movie, they will know when they are ready to fly. They aren't in preparation for anything. They live in the real world right now, and it is a wonderful, amazing, challenging, beautiful, extraordinary place.
In _Teach Your Own,_ John Holt wrote, "I used to say, and say now, that a college degree isn't a magic passkey that opens every door in town. It opens only a few, and before you spend a lot of time and money getting one of those keys, it's a good idea to find out what doors it opens (if any), and what's on the other side of those doors, and to decide whether you like what's on the other side, and if you do, whether there may not be an easier way to get there." Rowan and Dagny will decide for themselves whether college is the way to get where they want to go.
When people say that school prepares children for the real world, what's implied is that it is the difficult parts of school (doing things you don't want to do, forced interaction with peers, following rules that you don't believe in) that are important. What's implied is that the real world is going to be an unhappy place and that being treated unfairly by people is a part of life.
_The real world is what we make it._
It may be a part of life in school, but it is not a part of our lives. School is as far away from the real world as possible. In school we learn that we cannot control our own destinies and that it is acceptable to let others govern our lives. In the real world we can take responsibility for choosing our own paths and governing our own lives. The real world is what we make it. As unschoolers we can choose to make it fascinating and loving and peaceful, and we can immerse ourselves in it every day.
I believe that having your time regulated by bells, eating on a schedule, having very little privacy or opportunity for self-determination, having to ask permission to perform bodily functions, and having to think on command, causes nothing but a feeling of fear when you are finally let loose into the world. It does nothing to help you to live a joyful life.
No adult is forced to sit when she wants to run, listen when she wants to sing, draw when she wants to read, or be inside when she wants to be outside. The real lessons that children learn in school do nothing to improve their lives as adults and do much to hinder a joyful childhood. In _The Six-Lesson Schoolteacher,_ John Taylor Gatto (New York City Teacher of the Year in 1990) lists the six lessons he believes are really taught in school:
* "The first lesson I teach is: 'Stay in the class where you belong.' I don't know who decides that my kids belong there but that's not my business."
* "The second lesson I teach kids is to turn on and off like a light switch."
* "The third lesson I teach you is to surrender your will to a predestined chain of command. Rights may be granted or withheld, by authority, without appeal. As a schoolteacher I intervene in many personal decisions, issuing a Pass for those I deem legitimate, or initiating a disciplinary confrontation for behavior that threatens my control."
* "The fourth lesson I teach is that only I determine what curriculum you will study. (Rather, I enforce decisions transmitted by the people who pay me.)"
* "In lesson five I teach that your self-respect should depend on an observer's measure of your worth. My kids are constantly evaluated and judged."
* "In lesson six I teach children that they are being watched. I keep each student under constant surveillance and so do my colleagues. There are no private spaces for children; there is no private time."
These are lessons that an unschooled child never learns, and not one of them will help a child live in joy or contribute to her growing up to be a happy and autonomous adult.
© 2005 Rue Kream
"If you have a garden and a library, you have everything you need."
Marcus Tullius Cicero (106-43 BCE)
# Learning to Trust
by Jan Hunt
It's only natural for parents to feel uneasy and uncertain when contemplating a path for their children other than the one they themselves traveled. Those of us who decide to unschool — even when we are convinced that this is the best option for our child — must unlearn many unfounded assumptions about learning that we were conditioned to believe for so many years. If we can do that, we can rediscover the natural love of learning we were born with.
Most of us were taught at school to see a false dichotomy between "learning" and "fun". We came to believe that if it's "educational", it can't be fun, and if it's fun, it can't be learning! A child who is unschooled from the beginning, as my son Jason has been, enjoys life free of such preconceptions, and continues to see all learning as a wondrous and rewarding experience.
Schools operate under the very different assumption that learning can be imposed from outside the child through various types of coercion, manipulation, rewards and punishments, and that there are distinct deadlines a child "has to" reach or he will never "catch up" (one might ask: catch up with whom, and why?). These are false assumptions, but it can be difficult to let them go when they were so ingrained in our own childhood.
While an understanding of the true spirit of learning comes naturally to Jason, I have had to unlearn many of these assumptions. In that sense, Jason has been my mentor, continually reminding me that learning is not restricted to a specific curriculum, location, time of day, or even to the presence of a "teacher". Jason has taught himself much of what he now uses in his work as our Natural Child Project webmaster and editor.
In a way, we are a generation with a most difficult task, because we are truly forging new trails and gaining new understandings. As I often remind parents in my workshops, unschooling should be much easier when children who were themselves unschooled choose this path for their own children. For these new parents, unschooling will be the norm, and they will have no need to unlearn so many well-meant but harmful beliefs. They will have a much simpler and truer understanding: every child grows at their own natural pace, and, like flower gardeners, parents simply need to trust their children's unique schedules.
_Most of us were taught at school
to see a false dichotomy
between "learning" and "fun"._
Just as we trust a rose to bloom on its own built-in timetable, so too should we expect children to bloom at their own best pace, and in their own way. There is so much time for a child to grow! If he or she reads fluently at three, at six, or even at twelve, what difference does that really make in the long run? The only real difference it can make is a positive one: a child who is trusted to read when he is ready has the best chance of enjoying a lifetime of pleasurable reading. Yet, because we attended years of school, such understandings can be hard to grasp. Every unschooling parent has likely felt intimidated and unsure at some point. Unschooling is a leap of faith for any parent who attended school in their own childhood.
Jason is now a young adult. When I look back over the years, I see joyful, enthusiastic learning that I have been privileged to share. It has been a happy experience that couldn't have been further from the six hours of drudgery that I had first imagined it would be! Jason not only enjoys learning many things, he sees learning as an interesting, integral part of all life, not a separate activity confined to specific locations, days, or times. In that sense, he is still unschooling and always will be. For Jason, this path has been far more than just an alternative to formal schooling; it has prepared him to live a life full of curiosity and wonder.
_Living is learning._
As John Holt once wrote, "Living is learning." This statement appears in an engaging collection of Holt's letters called _A Life Worth Living._ Judging from my own experience, I believe that Unschooling is a leap worth taking, one that can lead to a life worth living. But this leap does not need to be attempted alone. Unschooling friends and support groups, books, articles, and websites can be very enlightening. But most of all, we can allow our child to teach us how joyful and natural learning can be. For instruction on Unschooling, our children are the single best source of encouragement, inspiration, and reassurance we can have.
©2003 Jan Hunt
"Coercion or compulsion never brings about growth. It is freedom that accelerates evolution."
Paramahansa Yogananda
# Every Waking Hour
by John Holt
Among the many things I have learned about children, learned by many, many years of hanging out with them, watching carefully what they do, and thinking about it, is that children are natural learners.
The one thing we can be sure of, or surest of, is that children have a passionate desire to understand as much of the world as they can, even what they cannot see and touch, and as far as possible to acquire some kind of skill, competence, and control in it and over it. Now this desire, this need to understand the world and be able to do things in it, the things the big people do, is so strong that we could properly call it biological. It is every bit as strong as the need for food, for warmth, for shelter, for comfort, for sleep, for love. In fact, I think a strong case could be made that it might be stronger than any of these.
A hungry child, even a tiny baby who experiences hunger as real pain, will stop eating or nursing or drinking if something interesting happens, because that little child wants to see what it is. This curiosity, this desire to make some kind of sense out of things, goes right to the heart of the kind of creatures that we are.
_Children are natural learners._
Children are not only extremely good at learning, they are much better at it than we are. As a teacher, it took me a long time to find this out. I was an ingenious and resourceful teacher, clever about thinking up lesson plans and demonstrations and motivating devices and all of that ackamarackus. And I only very slowly and painfully—believe me, painfully—learned that when I started teaching less, the children started learning more.
I can sum up in five to seven words what I eventually learned as a teacher. The seven-word version is: Learning is not the product of teaching. The five-word version is: Teaching does not make learning. As I mentioned before, organized education operates on the assumption that children learn only when and only what and only because we teach them. This is not true. It is very close to one hundred percent false.
_Teaching does not make learning._
Learners make learning. Learners create learning. The reason that this has been forgotten is that the activity of learning has been made into a product called "education", just as the activity, the discipline, of caring for one's health has become the product of "medical care", and the activity of inquiring into the world has become the product of "science", a specialized thing presumably done only by people with billions of dollars of complicated apparatus. But health is not a product and science is something you and I do every day of our lives. In fact, the word _science_ is synonymous with the word _learning._
What do we do when we make learning, when we create learning? Well, we observe, we look, we listen. We touch, taste, smell, manipulate, and sometimes measure or calculate. And then we wonder. We say, "Well, why this?" or "Why is it this way?" or "Did this thing make this thing happen?" or "What made this thing happen?" or "Can we make it happen differently or better?" or "Can we get the Mexican bean beetle off the beans?" or "Can we raise more fruit?" or "Can we fix the washing machine?" or whatever it might be. And then we invent theories, what scientists call hypotheses; we make hunches. We say, "Well, maybe it's because of this", or "Perhaps it's because of that", or "Maybe if I do this, this will happen." And then we test these theories or these hypotheses.
We may test them simply by asking questions of people we think know more than we do, or we may test them by further observation. We may say, "Well, I don't quite know what that thing is, but maybe if I watch it longer I will find out." Or maybe we do some kind of planned experiment—"Well, I'll try putting this on the beans and see if it does something to the bean beetles", or "I'll try doing something else." And from these, in various ways, we either find out that our hunch was not so good, or perhaps that it was fairly good, and then we go on, we observe some more, we speculate some more. We ask more questions, we make more theories, we test them.
This process creates learning, and we all do it. It's not just done by people at MIT or Rensselaer Polytechnic. We do it. And this is exactly what children do. They are hard at work at this process all their waking hours. When they're not actually eating and sleeping, they're creating knowledge. They are observing, thinking, speculating, theorizing, testing, and experimenting—all the time—and they're much better at it than we are. The idea, the very idea, that we can teach small children how to learn has come to me to seem utterly absurd.
_Children learn from anything and everything they see._
As I was writing this, there came, as if by wonderful coincidence, a long letter from a parent. At one point she says something that is so good that it could be a title for this book: "Every Time I Think of Something to Teach Them They Already Know It."
Children learn from anything and everything they see. They learn wherever they are, not just in special learning places. They learn much more from things, natural or made, that are real and significant in the world in their own right and not just made in order to help children learn; in other words, they are more interested in the objects and tools we use in our regular lives than in almost any special learning materials made for them. We can best help children learn, not by deciding what we think they should learn and thinking of ingenious ways to teach it to them, but by making the world, as far as we can, accessible to them, paying serious attention to what they do, answering their questions—if they have any—and helping them explore the things they are most interested in. The ways we can do this are simple and easily understood by other people who like children and will take the trouble to pay some attention to what they do and think about what it may mean. In short, what we need to know to help children learn is not obscure, technical, or complicated, and the materials we can use to help them lie ready to hand all around us.
Excerpted from _Learning All the Time_ by John Holt. © 2005. Reprinted by arrangement with Basic Books, a member of the Perseus Books Group (perseusbooks.com). All rights reserved.
## Editors
**Jan Hunt** is the Director of The Natural Child Project at naturalchild.org, a member of the Board of Directors for the Canadian Society for the Prevention of Cruelty to Children, and a member of the Advisory Boards for Attachment Parenting International and Child-Friendly Initiative.
**Jason Hunt,** Jan's son, co-edited and designed the layout for this book. He is the designer and webmaster of naturalchild.org and the Global Children's Art Gallery, and has helped edit much of Jan's writing, including her book _The Natural Child._ He has unschooled all of his life.
## Publisher
**The Natural Child Project,** established in 1996, provides information and support for attachment parenting, natural learning, and child advocacy.
Our website offers articles and advice by leading writers on parenting, unschooling, and child advocacy. The site also features parenting quotes, recommended books, related resources, the Attachment Parenting Family Directory, and the Global Children's Art Gallery. Our online shop offers many unique items for parents and children.
Our vision is a world in which all children are treated with dignity, respect, understanding, and compassion. In such a world, every child can grow into adulthood with a generous capacity for love and trust.
naturalchild.org
## Contributors
**Nanda Van Gestel** is a long-time unschooling mom of four sons living in the Netherlands. She believes that raising children, with as much love and freedom as possible, is the most fulfilling and important job in the world. Nanda enjoys writing about unschooling, motherhood, freedom and love.
**Jan Hunt** is a parenting counselor and the author of _The Natural Child: Parenting From the Heart_ and the children's book _A Gift for Baby._ She has published articles in numerous journals and parenting publications, as well as her own website at naturalchild.org.
**Daniel Quinn** is best known as the author of the highly acclaimed _Ishmael._ Other works offering inspired solutions to global challenges include _The Story of B, My Ishmael, Beyond Civilization, After Dachau, The Holy, Tales of Adam,_ and _If They Give You Lined Paper, Write Sideways._ For more information visit ishmael.org.
**Rue Kream** is living happily ever after with her husband, Jon, and two children, Dagny and Rowan. Rue is a passionate advocate of unschooling and respectful parenting. Her insightful first book, _Parenting A Free Child: An Unschooled Life,_ is available at freechild.info.
**Earl Stevens** was a founder of the Southern Maine Home Education Support Network and an advocate for unschooling in Maine and nationally for many years. He has been a popular columnist in _Home Education Magazine_ and a writer for his own publication, _Talk About Learning._
**Kim Houssenloge** lives with her husband Mark and son Lewis in a small country town on the far south coast of New South Wales, Australia, where they have followed a natural learning approach from the start.
**John Holt** was a pioneer of the American unschooling movement, editor of _Growing Without Schooling,_ and a leading advocate of unschooling and children's rights. An eloquent writer, John's many celebrated books include _How Children Learn, Teach Your Own,_ and _Learning All the Time._
**Mary Van Doren,** her husband Mark, and their daughters Helen, Greta, Alice and Veronica are long-time unschoolers living in Ohio. Mary is currently working as a children's librarian. She and Mark are former Board members of _Growing Without Schooling._
## How to Order this Book
Order online: naturalchild.org/shop
Order by phone: 877-593-1547
About this book: naturalchild.org/unmanual
Email us: naturalchild.org/contact
## Counseling with Jan
Jan Hunt, M.Sc. Counseling Psychology, has over twenty years of experience as a counselor and writer on attachment parenting and unschooling, and is the author of _The Natural Child: Parenting from the Heart_ and _A Gift for Baby._ She offers telephone counseling worldwide, with a focus on solutions that meet the needs of both parents and children.
Jan would be happy to talk with you! For more information or to schedule a session, write to jan@naturalchild.org.
naturalchild.org/counseling
## Also available from naturalchild.org
_The Natural Child_ by Jan Hunt Paperback and Audio
_A Gift for Baby_ by Jan Hunt Art by Sunny Rosanbalm
Children's Art Prints, Cards, and Clothing
_Parenting for a Peaceful World_ by Robin Grille
Custom Hand-Made Nursing Dolls
Parenting Cards 100 Gentle Reminders
naturalchild.org/shop
"Parents have a prior right to choose the kind of education that shall be given to their children."
United Nations
Universal Declaration of Human Rights
|
It’s the battle of the YA heroines in the first wave of Teen Choice Awards Nominations, announced Tuesday by Fox.
Jennifer Lawrence and Shailene Woodley both scored multiple nominations for multiple films, while on the TV side, it’s vampires versus vicious high school girls as “The Vampire Diaries” and “Pretty Little Liars” dominated the nominations with five each.
Lawrence, a previous winner for “The Hunger Games,” is nominated for the franchise sequel “The Hunger Games: Catching Fire” in Best Actress: Sci-Fi/Fantasy while also picking up a Best Actress: Drama nomination for her Oscar-nominated role in “American Hustle.” Woodley, for her part, is nominated for “Divergent” in Best Actress: Action and also “The Fault in Our Stars” in Best Actress: Drama.
See also: Can ‘Fault in Our Stars’ Still Clean Up After Early Box Office Flurry?
Starting today, fans ages 13-19 can vote once each day per category for their favorite nominees at www.teenchoiceawards.com. |
SAN JOSE, Calif. -- The Detroit Red Wings seemingly were out of steam a minute into the third period Sunday.
They had been getting dominated by the San Jose Sharks for two periods and were trailing by two goals.
But, with their season again on the line, the Red Wings stormed back with three unanswered goals in the third period for a 4-3 victory against the Sharks in Game 5 of the Western Conference semifinals at HP Pavilion.
Now, the pressure squarely is on San Jose as the series returns to Joe Louis Arena for Game 6 on Tuesday with the Sharks leading 3-2.
Jimmy Howard was outstanding early, keeping his team in the game, as San Jose outshot Detroit 42-22. Pavel Datsyuk was dominant despite playing with a sore wrist, setting up three goals.
"Just says a lot about the nature of the room," Howard said. "We have that presence, guys with a lot of leadership. There’s a lot of character on our team and never any quit. Our season was on the line in the third period and we found a way."
Tomas Holmstrom snapped a 3-3 tie at 13:52 of the third period, tipping in a blast by Nicklas Lidstrom. Datsyuk stripped the puck from Patrick Marleau along the boards to set up Lidstrom’s shot.
"I gave it to Nick. I know Nick shoots pretty well," Datsyuk said. "Having Homie in front of the net is good with this one. I’m happy for him doing his job well."
Said Howard of Datsyuk: "He just amazes us every single night. He’s a world-class player."
The Red Wings needed to kill an elbowing penalty on Justin Abdelkader with five minutes remaining to seal the win. San Jose went 0-for-4 on the power play. The Red Wings are doing a better job of denying the Sharks clean entries into the zone, but Howard was their best penalty killer.
RED WINGS 4, SHARKS 3
Key play:
Pavel Datsyuk took the puck from Patrick Marleau along the boards and passed to Nicklas Lidstrom, whose shot was tipped in by Tomas Holmstrom with 6:08 to play in the third period, snapping a 3-3 tie.
Hero:
Datsyuk, playing with a sore wrist that prevented him from taking faceoffs, was dominant. He assisted on three goals.
Goat:
Patrick Marleau has had a disastrous series. The Sharks' leading scorer during the regular season has done nothing offensively and got beat badly by Datsyuk on the play that led to the winning goal.
Analysis:
The Red Wings have shown tremendous resolve in the third period of the past two games with the season on the line. They never gave up hope and truly believed they were capable of digging out of the 0-3 series hole they faced. The Sharks blew an opportunity to put the Red Wings away and now San Jose, with its history of playoff failure, must be feeling really tight.
"Howie really stole us the game tonight," Danny Cleary said. "He played huge, a lot of big saves."
The situation looked bleak for Detroit after Logan Couture slipped a backhand shot underneath Howard during a breakaway 54 seconds into the third period to give San Jose a 3-1 lead.
But the Red Wings answered quickly on goals by Jonathan Ericsson and Cleary 1:46 apart to tie it. Ericsson fired a loose puck past Antti Niemi from in front of the net at 3:43. Cleary scored on a good second effort during a wraparound attempt.
Joe Pavelski gave the Sharks a 2-0 lead at 15:32 of the second period, scoring on a two-on-one with Ryane Clowe. But Niklas Kronwall responded quickly for the Red Wings, taking a pass from Datsyuk and whipping in a shot at 16:25 while Holmstrom created some havoc in front.
"We responded both times (they jumped ahead by two) and that made a real difference for us and just hanging around," Babcock said.
"I don’t know what it was, but we’re not happy with the way we played in the the first two periods," Kronwall said. "I thought we did a better job in the third, hard on pucks in both zones."
The Sharks dominated territorially for most of the first period, outshooting the Red Wings 16-7 and taking a 1-0 lead at 17:18 when Devin Setoguchi tipped in a shot by Dan Boyle.
San Jose couldn’t put Detroit away when it had the chance and it cost the Sharks.
"We weren’t giving up at all, kept going after them, getting pucks at the net, trying to be active with the (defense),"
Lidstrom said. "We owe it to (Howard). He kept us in the game in the first two periods."
The Red Wings are trying to become just the fourth team in NHL history to win a series after trailing 3-0.
"We’re basically playing Game 7 every game here," Kronwall said. "We were able to steal one here. Jimmy played great, kept us in the game, gave us a chance to win."
But, Babcock knows they must be much better at home.
"Tonight, we weren’t as good as we could have been," Babcock said. "We’re going to be better for sure in Game 6." |
The share price of First Solar (NASDAQ:FSLR) slipped today, reaching and then falling even farther past its previous 52-week low of $21.50 to $21.22. The stock was trading on below-average volume. The stock price is down 1.8% with a volume of two million. The stock is trading at 71.5% of its 50-day moving average and 46.8% of its 200-day moving average.
While trading on below-average volume, Enerplus (NYSE:ERF) declined today, hitting and then dropping past its previous 52-week low to $20.37. While trading at a volume of 1.5 million, the stock price has fallen 1.7%. The stock has fallen over the last three months, dropping $4.64 (-18.3%) from $25.34 on January 9, 2012. The stock is trading at 88.9% of its 50-day moving average and 83% of its 200-day moving average.
The share price of Ultra Petroleum (NYSE:UPL) dipped today, reaching and then falling even farther past its previous 52-week low of $21.10 to $20.85. The stock was trading on below-average volume. Shares have fallen 1.6%, trading at a volume of 1.5 million. The stock has lost momentum over the last three months, losing $8.81 (-29.7%) from $29.67 on January 9, 2012. The stock is trading at 88.7% of its 50-day moving average and 73.6% of its 200-day moving average.
While trading on below-average volume, France Telecom (NYSE:FTE) fell today, hitting and then dropping past its previous 52-week low to $13.82. Trading at a volume of 1.3 million, the stock price is down 1.1%. The stock has been falling in the last two months, down $1.57 (-10.1%) from a price of $15.52 on February 7, 2012. The stock is trading at 92.8% of its 50-day moving average and 87.3% of its 200-day moving average.
The share price of IAMGOLD Corporation (NYSE:IAG) decreased today, reaching and then falling even farther past its previous 52-week low of $12.12 to $12.06. The stock was trading on below-average volume. On volume of 1.1 million shares, the stock price is down 2.1%. Over the last three months, the stock has lost $4.93 (-28.9%) from a price of $17.05 on January 9, 2012. The stock is trading at 84.9% of its 50-day moving average and 70.3% of its 200-day moving average.
While trading on below-average volume, Bill Barrett Corporation (NYSE:BBG) sank today, hitting and then dropping past its previous 52-week low to $24.21. The stock price has fallen 1.1% with a volume of 303,614. The stock is down over the last three months, having fallen $10.31 (-29.8%) from $34.61 on January 9, 2012. The stock is trading at 87.9% of its 50-day moving average and 72.2% of its 200-day moving average.
The share price of James River Coal Company (NASDAQ:JRCC) dropped today, reaching and then falling even farther past its previous 52-week low of $4.96 to $4.75. The stock was trading on below-average volume. Shares are down 2.8% and trading at a volume of one million. The stock has fallen over the last three months, dropping $2.61 (-35%) from $7.46 on January 9, 2012. The stock is trading at 83.9% of its 50-day moving average and 64.5% of its 200-day moving average.
While trading on below-average volume, Cloud Peak Energy (NYSE:CLD) declined today, hitting and then dropping past its previous 52-week low to $15.24. Trading at a volume of 232,635, the stock price is down 1.2%. The stock is down over the last two months, having fallen $4.44 (-22.5%) from a price of $19.70 on February 7, 2012. The stock is trading at 89.3% of its 50-day moving average and 80.5% of its 200-day moving average.
While trading on above-average volume, AngioDynamics (NASDAQ:ANGO) slipped today, hitting and then dropping past its previous 52-week low to $11.73. While trading at a volume of 278,076, the stock price has fallen 3.6%. The stock has lost momentum over the last three months, losing $1.81 (-13.3%) from $13.63 on January 9, 2012. The stock is trading at 93.5% of its 50-day moving average and 86.5% of its 200-day moving average. |
Ferry Route Level of Service Transportation service level measurements have been commonly used and accepted for highway systems, but similar service measures for ferry systems are less common, especially from the user's point of view. An approach to measuring ferry route level of service is described that allows comparisons among ferry routes and between ferries and alternate modes such as highways (i.e., drive-around choices) and transit. The recommended approach focuses on excess user waiting times (excess delay) by mode (automobile, registered carpool or vanpool, bus, truck, and walk-on passenger), combined with calibrated relationships between volume-to-capacity (V/C) ratio and user delays for forecasting purposes. Data on waiting times for vehicles in the queues were collected on all ferry routes serviced by Washington State Ferries, and an extensive statistical analysis was performed to compute the relationships between V/C ratios and excess waiting times. Excess delay was defined as the waiting time for missed vessel sailings due to overloads, if any, after a ferry patron has arrived at the dock. User delays were expressed in two forms: absolute number of minutes of waiting time, and the number of boat sailings missed before boarding a ferry. The boat wait concept was introduced to differentiate between excess delays caused by congestion that prevents a driver from boarding the next ferry, and delays related to the amount of service provided on a route as reflected in the headways between vessels. |
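As a rough, purely illustrative sketch of the calibration idea described above, the snippet below fits a curve relating V/C ratio to excess delay and converts a forecast delay into "boat waits." The exponential functional form, the data points, and the 40-minute headway are invented for illustration; none of them come from the Washington State Ferries data.

```python
# Hypothetical sketch: calibrate a V/C-to-excess-delay relationship from
# made-up queue observations, then forecast delay and "boat waits" for a new V/C.
import numpy as np
from scipy.optimize import curve_fit

# Invented observations: (V/C ratio, average excess delay in minutes).
vc = np.array([0.5, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2])
delay_min = np.array([0.0, 0.5, 2.0, 6.0, 14.0, 28.0, 45.0])

def delay_model(x, a, b):
    # Excess delay grows roughly exponentially as demand approaches capacity.
    return a * (np.exp(b * x) - 1.0)

params, _ = curve_fit(delay_model, vc, delay_min, p0=(0.1, 4.0))

headway_min = 40.0          # assumed time between sailings on this route
forecast_vc = 1.05
forecast_delay = delay_model(forecast_vc, *params)
boat_waits = forecast_delay / headway_min   # "boat wait" = sailings missed

print(f"At V/C = {forecast_vc}: ~{forecast_delay:.1f} min excess delay "
      f"(~{boat_waits:.1f} missed sailings)")
```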
<reponame>talenguyen/AndroidCore
package com.tale.androidcore.utils;
import android.animation.Animator;
import android.animation.AnimatorListenerAdapter;
import android.animation.ObjectAnimator;
import android.view.View;
import android.view.ViewGroup;
/**
 * Factory methods that build show/hide {@link Animator}s for a {@link View}:
 * slide in/out from the top or bottom edge, or fade via alpha. For the
 * top-edge helpers the {@code offset} fraction is clamped to the 0..1 range
 * and scales the slide distance.
 * Created by tale on 5/14/15.
 */
public class Animations {
// ==== show/hide by TRANSLATION_Y from TOP edge. ====
public static Animator showTranslateDownAnimator(View target, float offset, long duration) {
if (offset > 1) {
offset = 1;
} else if (offset < 0) {
offset = 0;
}
target.setVisibility(View.VISIBLE);
// Move to offscreen.
final float translationY = target.getBottom() * offset;
target.setTranslationY(-translationY);
// Create animator to animate view to onscreen.
ObjectAnimator animator = ObjectAnimator.ofFloat(target, View.TRANSLATION_Y, 0);
if (duration > 0) {
animator.setDuration(duration);
}
return animator;
}
public static Animator hideTranslateUpAnimator(View target, float offset, long duration, boolean autoGone) {
if (offset > 1) {
offset = 1;
} else if (offset < 0) {
offset = 0;
}
target.setVisibility(View.VISIBLE);
// Move to original position.
target.setTranslationY(0);
// Create animator to animate view to offscreen
final float translationY = target.getBottom() * offset;
ObjectAnimator animator = ObjectAnimator.ofFloat(target, View.TRANSLATION_Y, -translationY);
if (duration > 0) {
animator.setDuration(duration);
}
if (autoGone) {
addAutoGone(animator, target);
}
return animator;
}
// ==== show/hide by ALPHA. ====
public static Animator showAlphaAnimator(View target, long duration) {
target.setVisibility(View.VISIBLE);
target.setAlpha(0f);
final ObjectAnimator animator = ObjectAnimator.ofFloat(target, View.ALPHA, 1f);
if (duration > 0) {
animator.setDuration(duration);
}
return animator;
}
public static Animator hideAlphaAnimator(View target, long duration) {
target.setVisibility(View.VISIBLE);
target.setAlpha(1f);
final ObjectAnimator animator = ObjectAnimator.ofFloat(target, View.ALPHA, 0f);
if (duration > 0) {
animator.setDuration(duration);
}
return animator;
}
// ==== show/hide by TRANSLATION_Y from BOTTOM edge. ====
public static Animator hideTranslateDownAnimator(View target, float offset, long duration, boolean autoGone) {
final ViewGroup parent = (ViewGroup) target.getParent();
final float translationY = (parent.getHeight() - target.getTop()) * offset;
ObjectAnimator animator = ObjectAnimator.ofFloat(target, View.TRANSLATION_Y, translationY);
if (duration > 0) {
animator.setDuration(duration);
}
if (autoGone) {
addAutoGone(animator, target);
}
return animator;
}
public static Animator showTranslateUpAnimator(View target, float offset, long duration) {
final ViewGroup parent = (ViewGroup) target.getParent();
final float translationY = (parent.getHeight() - target.getTop()) * offset;
// Move to offscreen.
target.setTranslationY(translationY);
// Create animator to animate view to onScreen
ObjectAnimator animator = ObjectAnimator.ofFloat(target, View.TRANSLATION_Y, 0);
if (duration > 0) {
animator.setDuration(duration);
}
return animator;
}
private static void addAutoGone(ObjectAnimator animator, View target) {
animator.addListener(new AnimatorListenerAdapter() {
@Override
public void onAnimationEnd(Animator animation) {
super.onAnimationEnd(animation);
animation.removeListener(this);
target.setVisibility(View.GONE);
}
});
}
}
|
<filename>denops/ddc/types.ts
import { autocmd } from "./deps.ts";
export { BaseSource } from "./base/source.ts";
export { BaseFilter } from "./base/filter.ts";
export type DdcEvent = autocmd.AutocmdEvent | "Auto" | "Manual";
export type SourceName = string;
export type Custom = {
source: Record<SourceName, SourceOptions>;
option: DdcOptions;
};
export type Context = {
filetype: string;
input: string;
};
type CompletionMode = "inline" | "popupmenu" | "manual";
export type DdcOptions = {
autoCompleteEvents: DdcEvent[];
completionMode: CompletionMode;
filterOptions: Record<string, Partial<FilterOptions>>;
filterParams: Record<string, Partial<Record<string, unknown>>>;
keywordPattern: string;
sourceOptions: Record<SourceName, Partial<SourceOptions>>;
sourceParams: Record<SourceName, Partial<Record<string, unknown>>>;
sources: SourceName[];
};
export type SourceOptions = {
converters: string[];
forceCompletionPattern: string;
ignoreCase: boolean;
mark: string;
matchers: string[];
maxAutoCompleteLength: number;
maxCandidates: number;
minAutoCompleteLength: number;
sorters: string[];
};
export type FilterOptions = {
// TODO: add options and remove placeholder
placeholder: void;
};
export type Candidate = {
word: string;
abbr?: string;
menu?: string;
info?: string;
kind?: string;
dup?: boolean;
userData?: unknown;
};
// For internal type
export type DdcCandidate = Candidate & {
icase: boolean;
equal: boolean;
source: SourceName;
};
|
New role and molecular mechanism of Gadd45a in hepatic fibrosis. AIM To investigate the role of Gadd45a in hepatic fibrosis and the transforming growth factor (TGF)-β/Smad signaling pathway. METHODS Wild-type male BALB/c mice were treated with CCl4 to induce a model of chronic liver injury. Hepatic stellate cells (HSCs) were isolated from the liver of BALB/c mice and were treated with small interfering RNAs (siRNAs) targeting Gadd45a or the pcDNA3.1-Gadd45a recombinant plasmid. Cellular α-smooth muscle actin (α-SMA), β-actin, type I collagen, phospho-Smad2, phospho-Smad3, Smad2, Smad3, and Smad4 were detected by Western blots. The mRNA levels of α-SMA, β-actin, and type I collagen were determined by quantitative real-time (qRT)-PCR analyses. Reactive oxygen species production was monitored by flow cytometry using 2′,7′-dichlorodihydrofluorescein diacetate. Gadd45a, Gadd45b, anti-Gadd45g, type I collagen, and SMA local expression in liver tissue were measured by histologic and immunohistochemical analyses. RESULTS Significant downregulation of Gadd45a, but not Gadd45b or Gadd45g, accompanied by activation of the TGF-β/Smad signaling pathways was detected in fibrotic liver tissues of mice and isolated HSCs with chronic liver injury induced by CCl4 treatment. Overexpression of Gadd45a reduced the expression of extracellular matrix proteins and α-SMA in HSCs, whereas transient knockdown of Gadd45a with siRNA reversed this process. Gadd45a inhibited the activity of a plasminogen activator inhibitor-1 promoter construct and (CAGA)9 MLP-Luc, an artificial Smad3/4-specific reporter, as well as reduced the phosphorylation and nuclear translocation of Smad3. Gadd45a showed protective effects by scavenging reactive oxygen species and upregulating antioxidant enzymes. CONCLUSION Gadd45a may counteract hepatic fibrosis by regulating the activation of HSCs via the inhibition of TGF-β/Smad signaling. |
EMERYVILLE, CAL. (SatireWire.com) — As random as they are relevant, enigmatic as they are enlightening, search engines have earned a slightly sullied reputation as a necessary evil. But it is a one-sided assessment. The search engines have not been able to explain themselves. Until now.
Thanks to its sophisticated program, which answers questions with phrases or sentences, Jeeves of AskJeeves.com granted SatireWire Editor Andy Marlatt the opportunity to actually interview a search engine. There were many important questions to ask. Unfortunately, he never got to most of them.
NOTE: These are real screen captures of actual responses. Advertisements appearing with results have been edited out, and the query boxes have been enlarged to allow readers to view entire questions. This does not in any way alter the responses. |
import { render } from '@testing-library/react-native'
import { AuctionsFaq } from './AuctionsFaq'
jest.mock('@shared-contexts/ThemeProvider')
describe('Auctions FAQ screen', () => {
it('should match snapshot', async () => {
const rendered = render(<AuctionsFaq />)
expect(rendered.toJSON()).toMatchSnapshot()
})
})
|
I co-led a conversation earlier today on the topic “Time Management: Tips & Tools for Managers Who Don’t Have Enough Time” with a team of plant managers. I had five ideas I wanted to foster conversation and teamwork around, and none of those ideas were complex or complicated. (I may share them with you at a later time.)
At one point in the conversation, one of the managers mentioned something along the lines of “It’s not rocket science – we just need to do a better job of practicing these things.”
My gut and head went in two different directions at that point.
My head told me I was wasting their time. Make it harder! Razzle-dazzle them! Throw in a bunch of statistics and graphs! Pull out a few five-syllable words!
My gut told me I had succeeded in what I had set out to do. They have a simple language they all understand and can use to support each other in applying the concepts in their day-to-day activities. Their heads are full of a bunch of other technical, hard-to-work-through stuff – they don’t need another complex or complicated idea. Luckily, my gut won today.
It’s not rocket science. That’s why it works. I learned this through a bit of education and a lot of trial by fire when I was a military leader and I’ve done my best to practice and teach it since then.
It reminds me of the line from Chapter 53 of the Tao Te Ching:
“The Tao is broad and plain
But people like the side paths”
We love those side paths. They’re interesting. They give us a convenient justification about why we didn’t just get it done. They let us play safe and small or not be responsible for the wake of our actions.
Then there’s Peter Drucker: “What you have to do and the way you have to do it is incredibly simple. Whether you are willing to do it, that’s another matter.”
Simple, effective ideas incorporated into a team’s everyday communication and practice lead to one of five things:
1. clarity about requirements vs. capabilities
2. conversations about priorities, focus, and accountability
3. an examination of the systems and processes that are relevant to the work in question
4. honest discussion about whether the team has the right mix of people
5. great execution
Every manager wants great execution; few are willing to work through the foundations to get there.
Leadership isn’t rocket science, either. Inconvenient Business Truth #13: Simple Does Not Equal Easy. |
/*
* This file is part of TechReborn, licensed under the MIT License (MIT).
*
* Copyright (c) 2017 TechReborn
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package techreborn.compat.crafttweaker;
import crafttweaker.CraftTweakerAPI;
import crafttweaker.IAction;
import crafttweaker.api.item.IIngredient;
import crafttweaker.api.item.IItemStack;
import crafttweaker.api.minecraft.CraftTweakerMC;
import net.minecraft.item.ItemStack;
import reborncore.common.util.ItemUtils;
import stanhebben.zenscript.annotations.ZenClass;
import stanhebben.zenscript.annotations.ZenMethod;
import techreborn.api.reactor.FusionReactorRecipe;
import techreborn.api.reactor.FusionReactorRecipeHelper;
import java.util.ArrayList;
import java.util.List;
@ZenClass("mods.techreborn.fusionReactor")
public class CTFusionReactor {
@ZenMethod
public static void addRecipe(IIngredient topInput, IIngredient bottomInput, IItemStack output, int startEU, int euTick, int tickTime) {
FusionReactorRecipe reactorRecipe = new FusionReactorRecipe((ItemStack) CraftTweakerCompat.toObject(topInput), (ItemStack) CraftTweakerCompat.toObject(bottomInput), CraftTweakerCompat.toStack(output), startEU, euTick, tickTime);
CraftTweakerAPI.apply(new Add(reactorRecipe));
}
@ZenMethod
public static void removeTopInputRecipe(IIngredient iIngredient) {
CraftTweakerAPI.apply(new RemoveTopInput(iIngredient));
}
@ZenMethod
public static void removeBottomInputRecipe(IIngredient iIngredient) {
CraftTweakerAPI.apply(new RemoveBottomInput(iIngredient));
}
private static class Add implements IAction {
private final FusionReactorRecipe recipe;
public Add(FusionReactorRecipe recipe) {
this.recipe = recipe;
}
@Override
public void apply() {
FusionReactorRecipeHelper.registerRecipe(recipe);
}
@Override
public String describe() {
return "Adding Fusion Reactor recipe for " + recipe.getOutput().getDisplayName();
}
}
@ZenMethod
public static void removeRecipe(IItemStack output) {
CraftTweakerAPI.apply(new Remove(CraftTweakerCompat.toStack(output)));
}
private static class Remove implements IAction {
private final ItemStack output;
List<FusionReactorRecipe> removedRecipes = new ArrayList<FusionReactorRecipe>();
public Remove(ItemStack output) {
this.output = output;
}
@Override
public void apply() {
for (FusionReactorRecipe recipeType : FusionReactorRecipeHelper.reactorRecipes) {
if (ItemUtils.isItemEqual(recipeType.getOutput(), output, true, false)) {
removedRecipes.add(recipeType);
FusionReactorRecipeHelper.reactorRecipes.remove(recipeType);
break;
}
}
}
@Override
public String describe() {
return "Removing Fusion Reactor recipe for " + output.getDisplayName();
}
}
private static class RemoveTopInput implements IAction {
private final IIngredient output;
List<FusionReactorRecipe> removedRecipes = new ArrayList<FusionReactorRecipe>();
public RemoveTopInput(IIngredient output) {
this.output = output;
}
@Override
public void apply() {
for (FusionReactorRecipe recipeType : FusionReactorRecipeHelper.reactorRecipes) {
if (output.matches(CraftTweakerMC.getIItemStack(recipeType.getTopInput()))) {
removedRecipes.add(recipeType);
FusionReactorRecipeHelper.reactorRecipes.remove(recipeType);
break;
}
}
}
@Override
public String describe() {
return "Removing Fusion Reactor recipe";
}
}
private static class RemoveBottomInput implements IAction {
private final IIngredient output;
List<FusionReactorRecipe> removedRecipes = new ArrayList<FusionReactorRecipe>();
public RemoveBottomInput(IIngredient output) {
this.output = output;
}
@Override
public void apply() {
for (FusionReactorRecipe recipeType : FusionReactorRecipeHelper.reactorRecipes) {
if (output.matches(CraftTweakerMC.getIItemStack(recipeType.getBottomInput()))) {
removedRecipes.add(recipeType);
FusionReactorRecipeHelper.reactorRecipes.remove(recipeType);
break;
}
}
}
@Override
public String describe() {
return "Removing Fusion Reactor recipe";
}
}
}
|
#!/usr/bin/python2
# server.py
# nroberts - 4/24/2017
import socket
import sys
import pickle
from tennis_show import TennisShow
import current_bridge
import Queue
from threading import Thread
thread_continuing = True
def main(bridge, port = '8000', *args):
serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
serversocket.bind(('0.0.0.0', int(port)))
serversocket.listen(2) # backlog of 2 pending connections; the server accepts two clients
inqueue = Queue.Queue()
outqueue = Queue.Queue()
show = TennisShow(bridge(), inqueue=outqueue, outqueue=inqueue)
show_thread = Thread(target=lambda: show.run(framerate=40))
show_thread.start()
conn1, addr1 = serversocket.accept() # block until client connects
print "Accepted %s" % conn1
conn2, addr2 = serversocket.accept()
print "Accepted %s" % conn2
def read1():
global thread_continuing
while thread_continuing:
buf = conn1.recv(4096)
if len(buf) > 0:
message = pickle.loads(buf)
outqueue.put(message)
read1thread = Thread(target=read1)
read1thread.start()
def read2():
global thread_continuing
while thread_continuing:
buf = conn2.recv(4096)
if len(buf) > 0:
message = pickle.loads(buf)
outqueue.put(message)
read2thread = Thread(target=read2)
def write():
global thread_continuing
while thread_continuing:
data = inqueue.get() # Block until we get something
print "Emitting to clients %s" % str(data)
pickled = pickle.dumps(data)
conn1.send(pickled)
conn2.send(pickled)
writethread = Thread(target=write)
writethread.start()
try:
read2thread.run() # note: .run() executes read2() on the calling (main) thread, blocking here to keep the server alive
finally:
global thread_continuing
thread_continuing = False
conn1.close()
conn2.close()
serversocket.close()
if __name__ == "__main__":
main(current_bridge.bridge, *sys.argv[1:])
|
Book Review: Diagnosing genius: the life and death of Beethoven The events of Beethoven's life have captured the popular imagination, making him the subject of innumerable biographies and at least two recent bio-pics. One question which has puzzled his biographers is how Beethoven could compose sublime music while labouring under ill health, particularly his deafness. Francois Mai, a professor of psychiatry at the University of Ottawa, offers some answers. Drawing on material from a wide range of sources, Mai makes good use of both primary and secondary works. Contemporary accounts of the composer's health are accessible in Beethoven's own writings, as well as those of his many physicians. To these Mai adds modern diagnostic tools, such as a toxicological analysis of a lock of Beethoven's hair. Despite the wide range of evidence presented, much of Mai's analysis is likely to frustrate the medical historian. In Diagnosing genius Mai is principally concerned with the description and interpretation of the medical evidence. Aiming at comprehensiveness, Mai endeavours to provide a more complete interpretation of the symptoms than has previously been achieved. He ranges over a wealth of conditions, from alcoholism, to syphilis, to lead poisoning, to assess the role each may have played in the cause of Beethoven's death. But such analysis cries out for historical contextualization. At several points throughout the book, for instance, Mai provides descriptions of Beethoven's relations with his many physicians. Famously irascible, Beethoven hired and fired physicians with an impressive regularity, largely depending on whether or not he approved of the treatment they prescribed. Yet Mai provides no discussion of the extensive historiography on the doctorpatient relationshipa central context for understanding Beethoven's behaviour. In his final chapter, Mai broadens his discussion to encompass the links between illness and creativity. Summarizing many of the insights of other authors on the subject, he systematically considers the effects that isolation, psychopathology, substance dependency and medical health problems may have on an individual's creativity. He suggests that conditions which Beethoven, and others, suffered may have fed their creativity, though the effect becomes deleterious if the illness is severe. Whilst careful in rehearsing the research of others, it is a pity that Mai seldom offers his own opinion. In the same chapter Mai asserts that though Beethoven was not a child prodigy like Mozart, he did display exceptional talent (p. 179). But this raises an interesting question, one which Mai does not address: to what extent is genius a social construction? In a fascinating aside, which sadly Mai does not capitalize upon, he reveals that the construction of the composer's reputation had a helping hand from his alcoholic father. Determined that his young son should be seen as a child prodigy, Beethoven's father concealed Beethoven's real birth date, putting it about that his son was two years younger than he actually was. This was a fact Beethoven himself only learned in his mid-forties, when circumstances required him to send for his birth certificate. But child prodigy or not, Mai's interpretation reveals a deep reverence for the composer, one which will brook no opposition to Beethoven's claims to eminence. Mai's careful research is a worthy addition to the genre of medical biography, a field of scholarship which seeks to establish what individuals really suffered from. 
For the medical historian, however, the value of the book is diminished by its emphasis on retrospective diagnosis and its disregard for contemporary historiography. Like its subject, Diagnosing genius displays a deafness of its owna deafness to historical context. |
A couple more beers coming your way!
BEER DETAILS:
Raised Eyebrows - This beer utilizes passionfruit and guavas from our tree in our parking lot. 100% aged in red wine barrels. The beer is fermented with our house cultures that contain lactobacillus, pediococcus, and brettanomyces. The exotic, ripe qualities of the unique fruit comes through as refreshingly zippy, aromatic, funky, and fun! 4%
1400 bottle release. $15 per bottle (4 bottle limit)
Company of Parrots - An ode to the colorful and noisy group of parrots that fly over Highland Park Brewery almost every single day! We tried to match their intensity with the creation of this barrel fermented and aggressively hopped farmhouse ale! Huge hop character is bursting out of the glass with amplified tropical fruit notes from the Simcoe, Nelson, and Mosaic. Barrel fermentation and our house mixed culture give background notes of brett funk, oak, and just a tinge of acidity. This is a fun one! 6.8% abv.
600 bottle release, $11 per bottle (2 bottle limit)
PURCHASING DETAILS (Starting Dec. 4th):
Online bottle sales start Fri. December 4th at noon and will end Friday December 11th at 11am (1 hour before release) or when bottles are sold out. Any additional bottles not sold online will be available for purchase the day of the event.
PICKUP DETAILS (Starting Friday Dec. 11th at noon):
Purchased bottles will be available at The Hermosillo starting on Friday, Dec. 11th at noon, and the beer will be on tap as well. Bottles will be available for pickup for two weeks following (until December 27th). Photo ID is required for pickup. Proxies are allowed, just email highlandparkbrewery@gmail.com providing both the purchaser and proxies names.
***Bottles not picked up by Dec. 27th will be considered forfeited.***
Please note that we will not issue refunds, exchanges, or returns. All sales are final. |
Effects of Exercise Dose and Detraining Duration on Mobility at Late Midlife: A Randomized Clinical Trial Background: Office workers near retirement tend to be sedentary and can be prone to mobility limitations and diseases. We examined the dose effects of exergaming volume and duration of detraining on motor and cognitive function in office workers at late midlife to reduce sedentariness and mobility limitations. Methods: In an assessor-blinded randomized trial, 160 workers aged 55-65 years performed physically active video games in a nonimmersive form of virtual reality (exergaming) in small, supervised groups for 1 h, 1, 2, or 3/week for 8 weeks followed by detraining for 8 and 16 weeks. Exergaming comprises high-intensity, full-body sensorimotor coordination, balance, endurance, and strengthening exercises. The primary outcome was the 6-minute walk test (6MWT), and secondary outcomes were body mass, self-reported physical activity, sleep quality, Berg Balance Scale, Short Physical Performance Battery, fast gait speed, dynamic balance, heart rate recovery after step test, and 6 cognitive tests. Results: The 3 groups were not different in any of the outcomes at baseline (all p > 0.05). The outcomes were stable and had acceptable reliability (intraclass correlation coefficients ≥0.334) over an 8-week control period. Training produced an inverted U-shaped dose response of no (1), most (2), and medium (3/week) effects of exergaming volume in most motor and selected cognitive outcomes. The distance walked in the 6MWT (primary outcome) increased most (94 m, 19%, p < 0.05), medium (57 m, 12%, p < 0.05), and least (4 m, 1%) after exergaming 2, 3, or 0 (control) (all different p < 0.05). The highest responders tended to retain the exercise effects over 8 weeks of detraining, independent of training volume. This maintenance effect was less consistent after 16 weeks of detraining. Conclusion: Less was more during training and lasted longer after detraining. A medium dose volume of exergaming produced the largest clinically meaningful improvements in mobility and selected cognitive tests in 60-year-old office workers with mild mobility limitations and intact cognition. Introduction Physical inactivity is the fourth leading risk factor for mortality, causing 6% of deaths globally. Office workers near retirement tend to be sedentary and can be prone
The dosing of exercise duration and frequency is, however, unclear from the guidelines and can vary according to age, disease, outcome (mortality, health, and fitness ), and motor and cognitive status at baseline. Large epidemiological studies suggest that a minimum of 150-300 min of moderate-or 75-150 min of vigorousintensity PA was associated with substantial increase in longevity benefits, which can further increase when the duration of total weekly PA is up to 450-750 min/week. However, there is also evidence suggesting that if participants are in a physically deconditioned, sedentary state, being active less than the recommended volume may already have health-promoting effects and sets people on a stable track of healthy aging independent of the type of PA. It is thus conceivable that even 1 per week, that is, low-volume exercise, could produce favorable effect, and 3 per week, that is, high volume of vigorous exercise, could produce a ceiling if not an overtraining effect in certain measures. We thus hypothesized that a medium volume (duration) of exercise may produce the largest effects on mobility and cognitive outcomes in office workers at late midlife. For logistical and adherence reasons, many exercise research interventions are designed to last for a few months. Even if individuals exercise for a prolonged period, vacation, illness, or moving can interrupt the exercise regimen. Detraining, that is, the withdrawal of the exercise stimulus following a period of exercise training, is a contentious and poorly understood phenomenon, and the cumulative effects of detraining on mobility and cognition in adults at late midlife have rarely been examined. Indeed, there is evidence for lasting effects of various exercise protocols on functional outcomes and in a limited number of studies on measures of cognition in older adults. However, there is an equal number of studies showing no maintenance of the exercise-induced functional and cognitive gains after detraining. The common element emerging from these conflicting data that gives rise to our working hypothesis is that perhaps the lasting effects of exercise are actually not related to exercise parameters (i.e., intensity, volume, and frequency) but are instead related to the magnitude of gains in a given outcome. Based on the conflicting data, we tentatively hypothesize that retention of exercise-induced benefits following detraining is related to whether or not someone responds to the exercise stimulus and not to the volume of exercise performed. Taken together, the purpose of this single-blind, randomized trial was to determine the effects of exercise volume and detraining duration on mobility and cognitive outcomes in office workers at late midlife. Because of evidence suggesting strong effects on walking capacity and cognitive function, we used exergaming as an exercise stimulus that was also highly effective in patient groups, producing long-lasting effects following detraining. Participants and Design Full-time, public sector office workers (n = 160, 58% female) participated in the study. Of the 345 employees, 160 were enrolled in an institution-mandated periodic health screening in the hospital where they received information about the study (online Appendix 1; see www.karger.com/doi/10.1159/000513505 for all online suppl. material). Volunteers responding to the call subsequently filled in medical and health questionnaires. 
The inclusion criteria were age 55-65 years, male or female gender, and a commitment to the 28-week-long program according to randomization. The exclusion criteria were Mini-Mental State Examination (MMSE) score <20, severe cardiac disease, uncontrolled diabetes, uncontrolled hypertension, BMI >30 kgm −2, stroke or heart attack <1 year before, traumatic brain injury, seizure disorder, Parkinson's disease, ongoing orthopedic surgeries, pacemaker, hemophilia, current cancer, current severe cardiopulmonary conditions, use of steroids or opioids for pain, walking aids, or participation in an exercise program. A hospital physician examined all participants and decided about inclusion. The University Hospital's Ethics Committee approved the protocol and the informed consent, which each participant signed (IKEB0008/2018). The study is in agreement with the latest version of the Declaration of Helsinki. Online Appendix 1 shows the design of the 3-arm, single-blind, randomized clinical trial. A physical therapist not involved in the trial performed the concealed randomization of participants. He drew a colored ribbon from a covered box and attached one ribbon to each participant folder, designating the participant's group assignment. Group 1 (G1, n = 53, 47% F) completed an initial 8-week control period to assess reliability of the outcome measures followed by an 8-week-long exercise intervention, concluding with one 8-week detraining period. Group 2 (G2, n = 53, 32% F) and group 3 (G3, n = 54, 48% F) exercised for 8 weeks and completed two 8-week-long detraining periods. There were 4 assessments: at baseline (test 1), after 8 weeks of control (G1) or exercise (G2 and G3, test 2), 8 weeks of training (G1) or detraining (G2 and G3) (test 3), and 8 weeks of additional detraining (test 4). The order of the fitness tests was standardized among participants and testing sessions. Two physical therapists (PTs) and an assistant administering the tests were masked to group assignments and experimental phase (control, exercise, and detraining). Interventions Exercise was administered 1 (G1), 2 (G2), or 3 (G3) times per week for 1 h in the hospital PT gym after work on weekdays in groups of 8-10 participants in fall 2019. The warm-up program consisted of stationary cycling for 10 min at 1-2 kg resistance. Exergaming consisted of physically active video games in a nonimmersive form of virtual reality, illustrated previously by videos. Games included Xbox 360 modules, 10 min each: Reflex Ridge trains reflex responses to visual stimuli; Space Pop trains spatial orientation through target reaching with arms, legs, and whole body, and Just Dance prompts users to generate and combine movement sequences with a strong demand on the aerobic system. Exergaming was designed to improve walking ability, gait stability, turning, postural control, and static and dynamic balance. While in the present study we did not measure heart rate during exergaming, in previous studies using similar interventions, the heart reached 80% of the age-predicted maximum, implying a high aerobic training stimulus. In addition to exergaming, strengthening exercises included plank positions for a total of 10 min. Cooldown consisted of stretching and breathing exercises. Two PTs and an assistant administered the interventions but none of the tests.
For the control period (G1) and for the detraining periods (G1, G2, and G3), participants were instructed to continue their habitual activities without changing their diet and exercise habits. Primary Outcome Because exercise duration was the dosing factor in the intervention, which was delivered at a high intensity (rate of perceived exertion can reach ∼16 of 20 ), we set the 6-minute walk test (6MWT) as the primary outcome. The 6MWT is a reliable and valid measure of walking capacity that is sensitive to change. The clinically meaningful large change is 50 m for the 6MWT in mobility-limited older adults but near 35 m for healthier and younger adults because exercise training increased 6MWT distance by an average of 36 m in 9 studies of healthy older adults. Secondary Outcomes PA was measured with the International Physical Activity Questionnaire (IPAQ) self-administered, short form, which has acceptable measurement properties for monitoring levels of PA among 18-to 65-year-old adults in diverse settings. Sleep quality and quantity was self-assessed with the Pittsburgh Sleep Quality Index (PSQI) that has good psychometric properties. Because the intervention was multimodal (e.g., balance, coordination, agility, reaction time, and endurance), we also assessed mobility with a composite measure, the Short Physical Performance Battery, SPPB, which is a reliable and valid test of standing balance, habitual walking speed, and leg strength and is sensitive to change. The Berg Balance Scale (BBS) was used to assess static balance and fall risk. Dynamic balance during walking was assessed on 4-, 8-, and 12-cm-wide, 4-m-long, and 2-cm-high wooden beams. Participants performed one familiarization and one measurement trial on each beam barefoot. The instruction was "Walk the entire length of the beam at your habitual speed with arms free. The trial ends when you step off." The distance, number of steps, and time to complete the trial were determined and average velocity computed, by digitizing foot markers in video recordings. Moreover, fast walking speed was measured in 3 trials over 10 m, including acceleration and deceleration. Cardiovascular endurance was assessed with the 3-min-long box step test, and recovery heart rate during minute 4 was used to assess change in fitness. We used MMSE to measure cognitive impairment. We measured motor speed and attention with the Digit Symbol Substitution Test (DSST) and inhibition of cognitive interference with the Stroop color-word test. Memory span and working memory were measured with the Digit Span (DS) and Visual Memory Span (VMS) Forward and Backward. Statistical Analyses Using G*Power (version 3.1.9.2.), we estimated the number of participants needed for a significant group (G1, G2, and G3) by time (pre and post) interaction for the primary outcome. A priori power analysis revealed the need for 44 participants per group with a clinically meaningful increase of 50 m in the 6MWT, the primary outcome, producing a medium effect of 0.5 ( = 0.05; power = 1 − of 0.8). Because the reliability analysis showed stability of the measures upon a retest after 8 weeks (n = 53) and there were no differences between the 3 groups in the outcome measures at baseline, the main analysis was a 1-way ANOVA on post minus pre (delta) scores for each outcome measure. Continuous variables were normally distributed based on the Shapiro-Wilk test. Categorical variables were analyzed with a Kruskal-Wallis test. 
A significant effect, characterized by p 2 effect size, was interpreted as a group by time interaction and was followed by a Tukey's post hoc or a Mann-Whitney test to determine the means that differed from one another. Within-group changes were further quantified by computing Cohen's d. The Holm method was used to correct for family-wise error. The level of significance was set at p < 0.05. Effects of 8 Weeks of Intervention Body mass decreased by 3.1% in the 3 groups combined (p < 0.05). The weight loss was similar in G2 (4.6%) and G3 (3.8%) but greater than in G1 (0.6%). The number of days participants reported to perform vigorous PA did not change, but the number of minutes of vigorous and moderate PA increased overall by ∼70 min in the 3 groups (p < 0.05). The largest changes occurred in G2 (∼2 h), followed by G3 (90 min) and G1 (∼10 min). The number of minutes participants reported they walked in a day in-creased by ∼9 min in G3, 66 min in G1, and 57 min in G2. Daily sitting time (8.5 h) decreased by 0.9 h in the 3 groups (p < 0.05), by 1.7 h, the most, in G2, compared with the ∼1 h reduction in G3 without changes in G1. Sleep quality decreased the most in G3 by one unit (p < 0.05). Scores in BBS improved by 13% in the 3 groups (p < 0.05), the most in G2 (20%) compared with G3 (13%). These changes exceeded the 2 points or 7% increase in G1. SPPB improved overall by 0.7 points or 9%: the 13 and 11% improvements in G2 and G3 were similar, and these changes exceeded the small changes in G1. Fast gait speed improved overall by 0.5 m/s or 26% (p < 0.05), with similar increases of 39% (0.69 m/s) and 28% (0.49 m/s) in G2 and G3, exceeding the 11% (0.2 m/s) increase (p > 0.05) in G1. There was a ceiling effect for beam walking distance on the 12-and 8-cm-wide beams (no further data are shown for these conditions). On the 4-cm beam, distance walked increased overall by ∼0.5 m (p < 0.05). The improvements in G2 (0.9 m, 166%) exceeded the changes in G3 (0.3 m) and G1 (0.2 m). The number of steps on the beam increased overall by 1.2 or 73%. The time to complete the trials overall increased by ∼1.0 s. Heart rate dur-ing the step test decreased by 10 bmin −1 (p < 0.05). The decreases were more (p < 0.05) in G2 and G3 than in G1 (p > 0.05). Of the 6 cognitive tests, performance in DSST improved in the 3 groups combined by 13% (p < 0.05), driven by the 17% in G2 (p < 0.05). The performance in the Stroop test improved by −13 and −9% in G2 and G3 (both p < 0.05), more than the 4% change in G1 (p > 0.05). Effects of 8 and 16 Weeks of Detraining In general, the little training effect helped participants in G1 to minimize losses in the outcomes during 8 weeks of detraining. In most outcomes, detraining effects were similar in G2 and G3 (Tables 2, 3). G2 (−5%) and G3 (−4%) have maintained traininginduced weight loss after 8 and 16 weeks of detraining. G2 has retained the training-induced increases in vigorous PA at 8 and 16 weeks of detraining compared with G3 and G1, so that at 16 weeks, G2 reported still ∼1 h more (d = 2.22) vigorous PA which decreased below baseline in G3. Detraining did not affect the number of minutes walked per day. The difference in retention of reduced sitting time between G2 and G3 (∼0.5 h) after 8 weeks of detraining increased to ∼1.0 h (p < 0.05, d = 1.27). Detraining did not affect sleep quality. 
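The analysis pipeline summarized above (one-way ANOVA on post-minus-pre delta scores, partial eta squared, Tukey post hoc comparisons, Cohen's d, and the Holm correction) can be illustrated with a minimal sketch. The data below are simulated, the group means only loosely echo the reported 6MWT changes, and the SciPy/statsmodels calls stand in for whatever software the authors actually used.

```python
# Hypothetical re-creation of the analysis: simulated 6MWT delta scores,
# one-way ANOVA, partial eta squared, Tukey HSD, Cohen's d, Holm correction.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(42)
# Simulated post-minus-pre 6MWT changes (m); means loosely follow the reported 4/94/57 m.
g1 = rng.normal(4, 40, 53)    # exergaming once per week
g2 = rng.normal(94, 40, 53)   # exergaming twice per week
g3 = rng.normal(57, 40, 54)   # exergaming three times per week

f_stat, p_val = stats.f_oneway(g1, g2, g3)   # 1-way ANOVA on the delta scores

# Partial eta squared; for a one-way design this equals SS_between / SS_total.
deltas = np.concatenate([g1, g2, g3])
grand_mean = deltas.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in (g1, g2, g3))
ss_total = ((deltas - grand_mean) ** 2).sum()
eta_p2 = ss_between / ss_total

# Tukey's post hoc test to locate which group means differ.
groups = np.repeat(["G1", "G2", "G3"], [len(g1), len(g2), len(g3)])
tukey = pairwise_tukeyhsd(deltas, groups, alpha=0.05)

def cohens_d(a, b):
    """Effect size between two groups using the pooled standard deviation."""
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Holm correction across the family of outcome-level p values
# (only the single 6MWT p value is shown here as a placeholder).
reject, p_adjusted, _, _ = multipletests([p_val], method="holm")

print(f"F = {f_stat:.2f}, p = {p_val:.4f}, partial eta^2 = {eta_p2:.3f}")
print(tukey.summary())
print(f"Cohen's d, G2 vs G1: {cohens_d(g2, g1):.2f}")
print(f"Holm-adjusted p: {p_adjusted[0]:.4f}")
```

In the full analysis the same delta-score comparison would be repeated for each motor and cognitive outcome, with the Holm correction applied across that whole family of p values.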
Relative to baseline, the training-induced greater (p < 0.05) benefits were retained similarly (p > 0.05) in G2 and G3 compared with G1 at 8 and 16 weeks of detraining in fall risk (BBS), mobility and balance (SPPB), fast walking speed, and distance walked on the 4-cm-wide beam. G2 compared with G1 and G3 made longer steps while walking on the narrow beam and retained these longer steps at 16 weeks. Recovery heart rates after the step test decreased the most in G2 compared with the other 2 groups, and these reductions were better retained in G2 than G3 at 16 weeks of detraining. Detraining for any duration did not affect performance in DSST, but G2 and G3 versus G1 retained improvements in the Stroop test better, and additional 8 weeks of detraining did not further affect the differences in this test between G2 and G3. Discussion According to the hypotheses, we found that less was more during training and lasted longer after detraining: medium duration dose of exergaming produced the largest clinically meaningful improvements in mobility and selected cognitive tests in 60-year-old office workers with mild mobility limitations and intact cognition. Sample Characteristics at Baseline Near retirement, over 50% of the 160 participants had a current or past medical condition but had normal body and fat mass. Participants' PA level met WHO guidelines, as self-reported daily walking and vigorous PA amounted to 40-50 min (online Appendix 2). Daily sitting time was 8.5 h or ∼60% of wakeful time, substantially lower than the 10.0-10.6 h reported for office workers in other countries. These encouraging shorter daily sitting and longer walking times may be related to the rural setting of the trial that stimulates walking and the use of public transportation. While all outcomes, including IPAQ, were stable over the 8-week-long no-intervention control period (online Appendix 3), the reliability and validity of IPAQ are unclear, as some studies report ∼80% overestimation of PA in community-dwelling adults. The SPPB score of 8.5 suggests limitations in mobility, leg strength, and balance. Yet, the 534-m distance in the 6MWT and the 2.1-m/s fast gait speed suggest no gait impairments relative to the 529 m and 1.9 m/s norms. BBS scores were below 45, indicating no fall risk. A lack of correlation among SPPB, 6MWT, and fast gait speed at baseline (data not shown) indicates that each test measures a unique element of mobility so that these measures are not redundant. Beam walking measures dynamic balance and revealed ceiling effects on the 8-and 12-cm-wide beams. However, the 4-cm-wide beam poses a strong challenge that all participants were still able to manage. We thus strongly recommend future studies to use this test to measure dynamic balance, as the test does not suffer from floor or ceiling effects. Processing speed, inhibition, and memory measures were normal but did not correlate with each other or motor outcomes (data not shown). In sum, despite medical history and mild mobility limitation per SPPB, participants of the present study were apparently healthy community-dwelling office workers late midlife. Training Effects The high-intensity exercise stimulus was designed and proved previously to be effective to reduce body mass and improve mobility, balance, and whole-body sensorimotor coordination in patients with Parkinson's disease, multiple sclerosis, and older adults with severe mobility limitations. 
In the primary outcome, the 6MWT, the distance walked increased in an inverted U-shaped dose response by 4, 94, and 56 m in G1, G2, and G3 ( Fig. 1; Table 1). The increase in G2 is nearly double of the 50 m clinically meaningful change. A variety of exercise interventions increased 6MWT distance much less, by only 36 m in healthy older adults. Perhaps, the fitness level of those compared with our participants was lower, accounting in part for the much larger improvements we observed. The potency of the exercise intervention in G2 and G3 is signified by the unusually large increases in walking distance notwithstanding participants' relatively high level of daily PA and low daily sitting time at baseline (online Appendix 2). Most secondary outcomes also revealed an inverted Ushaped dose-response pattern favoring G2 (Table 1). Decreases of up to 5% in body mass agree with weight loss reported after multimodal training in adults and can meaningfully reduce risks for cardiovascular disease even in normal-weight adults at mid-late life. The pattern of changes in PA indeed raises questions about the IPAQ's validity because while participants in G2 reported the expected increases in (vigorous) PA in proportion to the 2 per week exercise, that was not the case in G3, reporting in fact less increase in PA than G2 and G1 ( Table 1). The large effect-size changes in G2 followed by G3 in BBS, fast gait speed, and dynamic balance on the 4-m beam provide compelling evidence for the efficacy of the highly challenging exergaming program to improve fall risk, walking capacity, and dynamic balance. Particularly relevant is the ∼1.0 increase in SPPB, freeing participants from mild mobility limitation. Comparing with the 0.1 m change (p > 0.05) in distance walked on the 4-cm beam in the control period (online Appendix 3), G2 walked nearly 1.0 m longer distance on the beams using 1.4 more steps, suggesting that participants in G2 had become more confident and comfortable facing the strong challenge and chose a strategy that kept them on the beam longer. The improvements in mobility-and balance-related outcomes are also encouraging in light of data suggesting no additional improvements in mobility with nutritional supplementation. The present study is highly relevant for exercise prescription. In the context of the study, 2/weekly exergaming seems to be the optimal training volume for improving most outcomes in community-dwelling adults at late mildlife. There are virtually no studies examining the effects of training volume in the context of the present study, but a meta-analytic review concluded that 90-120 min of balance training per week is the most effective in improving overall balance performance in older adults. It is not clear from the present data why once a week exergaming was ineffective. We speculate that intense 3/week exergaming might have caused overtraining because sleep quality decreased more than exercising once or twice a week. This speculation is in line with reductions in exercise and spontaneous PA in G3. An additional novel element of the high-intensity exergaming program is that not only did it improve mobility and balance-related outcomes but it did so by decreasing the cardiovascular load, as the heart rate during the step test decreased 12-17 beatsmin −1. This is an important finding because increased fitness could reduce fatigue, a condition adults report frequently at late midlife. 
Motor and cognitive function decline with age in a correlated manner due to aging-related changes in brain circuitry, pathology, and molecular fidelity. Our data strengthen the moderate evidence that exergaming can improve cognitive function in healthy adults at late midlife. Of the 6 cognitive tests, the exergaming-induced improvements in cognition were limited to the DSST and the Stroop test, favoring twice-weekly exergaming (Table 1), suggesting small but significant improvements in the inhibition of cognitive interference. These results, while limited, are important because participants were office workers with normal cognition at baseline. The data, however, provide no support for the hypothesized correlated improvements in motor and cognitive function, as we found no association between individual changes in motor and cognitive scores (data not shown). Taking the training data together, we found evidence for the "less (2 vs. 3 per week) is more" phenomenon in the context of the present study.
Detraining Effects
We observed a dissociation between the inverted U-shaped dose effects of exergaming on the 6MWT, the primary outcome, and the ensuing detraining effects. In ∼72 and ∼50% of the 160 participants, the training effects outlasted the training period for 8 and 16 weeks, respectively (Tables 2, 3). Figure 2 shows that the detraining effects at 8 weeks were related to the magnitude of training gains but not to the dose of exergaming volume, and that this effect weakened with an additional 8 weeks of detraining (Fig. 2c). That is, regardless of exergaming 1, 2, or 3/week, the greater the benefits of exergaming were, the better these gains were preserved. The pattern of lasting effects shown in Figure 2 was evident in the secondary outcomes (detailed correlation data not shown). Withdrawal of the training stimulus has produced highly inconsistent results with respect to balance. In mobility-limited Parkinsonian patients, the mobility benefits produced by the exergaming program used here lasted up to 6 months. In some studies, but not in others, older adults were able to maintain the functional gains induced by strength and balance training. Because of the high intensity of the sensorimotor stimulus during training, balance in particular, assessed here by fast gait speed, BBS, SPPB, and beam walking, can perhaps be selectively resistant to detraining, agreeing with previously proposed hypotheses. We also observed robust resistance to detraining for 16 weeks, especially in G2, in the exergaming-induced increases in fitness indexed by heart rate responses to a standard step test, agreeing with some but not all cardiovascular detraining data. The highly inconsistent detraining effects on cardiovascular function might be related to differences between studies in test intensity (maximal vs. submaximal), evaluation methods, participants' age, and training intensity, volume, and duration. Training improved DSST and Stroop word-color time only (Table 1). These effects were sustained only in the Stroop test for 8 but not for 16 weeks of detraining, conforming to the dose effect we observed for motor outcomes (Tables 2, 3). These data differ from the only detraining study we are aware of, which reported a complete reversal of timed-up-and-go dual-task performance to baseline after just 1 month of detraining in chronically strength-trained older women.
Exercise cessation reduced the exercise-improved quality of life and health, but training and detraining did not affect blood levels of brain-derived neurotrophic factor in healthy older adults. Taking the current detraining data together, we found evidence that, instead of the dose of training volume, the training-induced gains per se determined the lasting exergaming effects on motor function and on only one cognitive function in adults at late midlife.
Limitations
One limitation is the short intervention duration. However, when normalized for the number of sessions, outcome gains in longer exergaming studies are often similar to the gains reported in studies as short as the present work, implying a ceiling in the responses to the exercise stimulus. Without a maintenance program, we cannot tell if the intervention-induced gains in mobility could be maintained and slow the progression of mobility limitation. The small sample size prevented us from performing sex-stratified analyses. PA was measured by IPAQ instead of a wearable device. The substantial, ∼3.2 kg, reduction in body mass implies that participants might have modified their diet, which we did not monitor. While the 100% adherence and 0% dropout suggest that our participants tolerated the high exercise intensity well, specially trained therapists delivered the exercise sessions in a designated hospital facility, conditions unavailable elsewhere. However, participants could follow recent trends and perform agility exercises at home with remote supervision, reducing costs and staff burden. Without neural, physiological, or biomechanical markers, we were unable to determine the mechanisms of adaptations to training and detraining.
Perspective
Office workers near retirement tend to be sedentary and prone to mobility limitations and illnesses. Because exercise and PA recommendations are universal for broad age categories without consideration for the current level of mobility, it is unclear how much exercise could ameliorate the ill effects of sedentariness and reduce mobility limitations at late midlife. How long exercise effects last is another contentious issue, considering the wide range of findings, from no residual effects at all to effects that outlasted the training stimulus for several months. The current results inform physicians, therapists, and fitness specialists that less training could be more, with mobility and cognitive improvements lasting for up to 16 weeks after the end of the exercise program in adults at late midlife with mild mobility limitations at baseline. The data expand current guidelines of exercise prescription by showing that twice-weekly, enjoyable but vigorous exergaming can produce lasting effects on mobility and cognition in initially sedentary adults aged 60 with mild mobility limitations.
Conclusion
We found that less was more during training and lasted longer after detraining: a medium dose of exergaming produced the largest clinically meaningful improvements in mobility and selected cognitive tests in 60-year-old office workers with mild mobility limitations and intact cognition.
import sys

def _bad_args(*args):
    """Report unexpected arguments, print usage, and exit."""
    print(args)
    _PARSER.print_help()  # _PARSER is assumed to be a module-level argparse.ArgumentParser
    sys.exit(0)
Epidermal growth factor stimulates phospholipase D independent of phospholipase C, protein kinase C or phosphatidylinositol-3 kinase activation in immortalized rabbit corneal epithelial cells. PURPOSE Activation of phospholipase D (PLD) is believed to be an important signaling pathway involved in cell growth and differentiation in several tissues, in response to a variety of mitogens. The aim of the present study was to investigate the effect of epidermal growth factor (EGF) on PLD activity in rabbit corneal epithelial cells (RCEC). We have also examined whether the EGF effect is dependent on concurrent activation of phospholipase C (PLC), protein kinase C (PKC) or phosphatidylinositol 3-kinase (PI-3-kinase) in these cells. METHODS RCEC, immortalized with adenovirus SV-40, were cultured until they became confluent. The cells were labeled with myristic acid and incubated with or without EGF or other agents for specified time intervals. PLD activity was measured by quantifying phosphatidylethanol in cells incubated in the presence of ethanol. PLC activity was determined by measuring the radioactivity in inositol trisphosphate in myoinositol-labeled RCEC. PI 3-kinase activity was assessed by measuring the production of PIP3 in 32P-labeled cells. RESULTS Addition of EGF to RCEC stimulated PLD activity in a time- and dose-dependent manner. The maximal effect was observed with 150 ng/ml EGF and at 10 min of incubation. The PLD activity was also stimulated when phorbol myristate acetate (PMA) was added to the cells. Treatment of the cells with EGF stimulated PLC activity which was inhibited by U73122, a PLC inhibitor. Under the same experimental conditions, the inhibitor had no effect on EGF-stimulated PLD activity. Down-regulation of PKC or treatment of the cells with RO31-8220, a PKC inhibitor, inhibited the PMA- but not EGF-stimulated PLD activity. Incubation of the cells with wortmannin, a PI 3-kinase inhibitor, abolished the EGF-stimulated PI 3-kinase activity, but potentiated the EGF-stimulated PLD activity. The EGF effect was inhibited by treatment of the cells with tyrphostin B42, a receptor tyrosine kinase inhibitor. CONCLUSIONS These results indicate that EGF stimulates PLD activity in RCEC by a mechanism that involves tyrosine phosphorylation of a protein(s) in the cascade of biochemical reactions initiated by EGF-receptor interaction, and it is not dependent on concurrent activation of PKC, PLC, or PI 3-kinase in these cells. |
// Trees/BinaryTree/VerticalWidth_of_tree.java

// Tracks the extreme horizontal (column) positions reached during traversal.
private class help {
    int max = Integer.MIN_VALUE;
    int min = Integer.MAX_VALUE;
}

// Prints the horizontal span of the tree: the rightmost column index plus
// the absolute value of the leftmost (negative) column index.
public void verticalWidth() {
    help h = new help();
    verticalWidth(this.root, h, 0);
    System.out.println(h.max + Math.abs(h.min));
}

private void verticalWidth(Node node, help h, int curr) {
    if (node == null) {
        return;
    }
    verticalWidth(node.left, h, curr - 1);   // left child is one column to the left
    if (h.min > curr) {
        h.min = curr;
    }
    if (h.max < curr) {
        h.max = curr;
    }
    verticalWidth(node.right, h, curr + 1);  // right child is one column to the right
}
|
REED CITY — In many ways, the St. Mary and Climax-Scotts basketball programs are very similar.
Both the Snowbirds and Panthers have been on a steady rise the past three years and both are in the middle of a breakthrough.
The two teams will collide at 7 p.m. Tuesday in the Class D Quarterfinal in Reed City with a trip to Michigan State University’s Breslin Center on the line.
Both teams have played in three straight Regional Finals. For Climax-Scotts (19-5), the breakthrough was winning the Regional. After losing that game the past two years, the Panthers got it done Thursday with a 41-28 victory against Fulton-Middleton.
St. Mary, on the other hand, has built on its postseason success over the past three years, advancing to the Regional Finals in 2011 and the Quarterfinals in both 2012 and this season.
Last season, the Snowbirds led for much of the way but lost to Crystal Falls Forest Park 59-57 in the type of game where both teams left with their heads held high.
St. Mary (23-2) would love to extend its season and move on to the Final Four for the first time since 2002.
To do so, the Snowbirds will have to get past a Panthers team that has played very well in the tournament. Climax-Scotts defeated Battle Creek Calhoun Christian, 70-23, and Martin, 51-29, in the Districts before downing Hudsonville Freedom Academy, 50-13, and Fulton-Middleton in the Regionals.
The Panthers feature a balanced scoring attack with three players — Stephanie Cochran, Destiny Froberg and Fallon Froberg — leading the team in scoring during the tournament. Janae Langs is also a dangerous scorer for Climax-Scotts.
St. Mary saw its biggest test in the tournament to date with a 39-23 victory against Bear Lake in Thursday’s Regional Final. Other than that, it’s been smooth sailing for the Snowbirds, who won their first four games by an average margin of more than 30 points.
The emphasis likely will be on stopping Snowbird junior guard Kari Borowiak. That was the case against Bear Lake, which held the 1,000-point career scorer to just six points. But it was the other players, like seniors Mary Spyhalski, Christina Smith, Sarah Long and Jada Bebble, who stepped up, proving this St. Mary team has five viable weapons.
The winner of this game will face the winner of No. 1-ranked St. Ignace and Crystal Falls Forest Park in the Class D State Semifinals Thursday at 6 p.m. |
SEATTLE — Anyone who has ever tried to return something at a store knows it can sometimes be stressful.
But when Kelly Blue Kinkel tried to return a coat she ordered before Christmas, she got an entirely different experience.
Kinkel shared the story on Facebook. In her post, she said she had one of the best customer service experiences of her life after deciding to return the coat to Seattle-based company Zulily.
“I called customer service and asked how to return the unopened coat for a refund. I spoke with a sweet young man named Patrick, and he let me know he would refund my money immediately. I asked again how to send it back, and he said, ‘Please don’t send it back. If you know someone who needs a winter coat or if you would like to donate it to a charity, that would make us very happy.'”
Kinkel thought he was joking.
But when she realized he was serious, she was brought to tears.
Kinkel says she’s now a loyal Zulily customer.
“I just don’t know other companies that do this, do you? I thought Zulily was pretty incredible before, but after today, I’m a customer for LIFE. The world needs more LOVE like that. Honest business. Honest ethics. How refreshing!”
Kinkel’s story has been shared more than 27,000 times.
You can read the whole post below: |
The present invention relates to a method of growing a p-type ZnO based oxide semiconductor layer in which a p-type ZnO based oxide semiconductor layer is grown with a high carrier concentration and a method of manufacturing a semiconductor light emitting device using the same. More specifically, the present invention relates to a method of growing a p-type ZnO based oxide semiconductor layer in which the acceptor level of a p-type dopant is reduced and p-type dopants are doped to fully act as acceptors, thereby sufficiently increasing a carrier concentration thereof and a method of manufacturing a semiconductor light emitting device using the same.
A blue-color-based light emitting diode (hereinafter referred to as an LED; covering a wavelength region from ultraviolet to yellow) to be used for a full color display, a signal light or the like, and a blue laser diode (hereinafter referred to as an LD) for a very fine next-generation DVD light source which continuously oscillates at room temperature, can be obtained by laminating GaN based compound semiconductor layers on a sapphire substrate, and these devices have recently attracted attention. While the GaN based compound semiconductor is dominant among light emitting devices having a short wavelength, the use of a II-VI compound semiconductor such as ZnO has also been investigated. The ZnO has a band gap of 3.37 eV at room temperature, and it has also been expected that the ZnO based oxide can be applied to a transparent conductive film, a transparent TFT and the like in addition to the DVD light source.
In ZnSe, a p-type semiconductor layer of the II-VI compound has been implemented by activating a nitrogen gas using a plasma and doping the activated nitrogen. However, when the same method has been tried for ZnO, a p-type layer having a high carrier concentration has not been implemented. For example, the N concentration obtained by SIMS when growing a ZnO layer while supplying a ZnO material and plasma nitrogen as a p-type dopant at a high substrate temperature of 500 to 600°C is very small, appearing like noise, as shown in FIG. 9 together with the secondary ion strength of ZnO. In FIG. 9, there is a portion in which the N concentration has a maximum value, based on the fact that the substrate is once taken out of the growing apparatus in order to grow an undoped ZnO layer and an N-doped p-type layer and to recognize the boundary thereof. However, it is apparent that the N concentration is rarely varied between the undoped layer and the p-type layer. The reason why the N concentration is very noisy in FIG. 9 is that the concentration is low and close to the detection limit of the SIMS measurement.
Although the reason is not definite, for example, it has been published that nitrogen entering an oxygen site of ZnO (the condition of p-type conduction) creates a deep acceptor level of approximately 200 meV, and furthermore, makes crystal structure unstable and generates an oxygen hole so that doping of ZnO with nitrogen becomes hard in "Solution using a codoping method to Unipolarity for the fabrication of p-type ZnO" (Japanese Journal of Applied Physics, Vol. 38, pp. 166 to 169, 1999) written by T. Yamamoto et al. As one of the solutions, the paper has proposed a codoping method for simultaneously doping nitrogen to be an acceptor and a III group element to be a donor. More specifically, there have been described the effect of mutually bonding a III group element and nitrogen through codoping to enter into a ZnO crystal lattice, thereby preventing the instability of crystals from being caused by nitrogen doping and the effect of reducing the acceptor level.
As described above, it has been proposed that a III group element such as Ga to be an n-type dopant is simultaneously doped in addition to nitrogen to be a p-type dopant in order to form the p-type ZnO based oxide semiconductor layer. However, there is a problem in that a p-type layer having a high carrier concentration cannot be obtained even if the nitrogen and the III group element such as Ga are actually doped simultaneously. In particular, although the present inventors have found that the residual carrier concentration is reduced when a ZnO based oxide is grown at a high temperature of 500°C or more, during such high temperature epitaxial growth the oxidation speed is higher than the nitrogenization speed. Therefore, there is a problem in that Ga is doped more than the main dopant N even if the simultaneous doping is carried out, as shown in FIGS. 7 and 8 showing the concentrations of Ga and N obtained by the simultaneous doping at 600°C. FIG. 7 shows that a larger amount of Ga is doped than that in FIG. 8 and N is also doped more easily if the amount of Ga to be doped is increased. However, the concentration of N does not exceed that of Ga.
In consideration of the circumstances, it is an object of the present invention to provide a method of growing a p-type ZnO based oxide semiconductor layer capable of doping N to be a p-type dopant at a stable carrier concentration and sufficiently increasing the carrier concentration of the p-type layer made of ZnO based oxide semiconductor, by employing a simultaneous doping method in high temperature growth in which a residual carrier concentration can be reduced.
It is another object of the present invention to provide a method of manufacturing a semiconductor light emitting device which can grow a p-type ZnO based oxide semiconductor layer having a high carrier concentration, thereby obtaining a semiconductor light emitting device such as a light emitting diode or a laser diode which is excellent in a light emitting efficiency.
The present inventors investigated the reason why a p-type ZnO based oxide semiconductor layer having a sufficiently high carrier concentration cannot be obtained by codoping. As a result, it was found that the chemical activity of oxygen is very high under the condition that Zn, O, N to be a p-type dopant and Ga to be an n-type dopant coexist and grow, so that, for example, a reaction forming ZnO and GaO proceeds much earlier than one forming ZnN and GaN.
In other words, the following is apparent from the theory of the above-mentioned paper. Even if N alone enters into the site of O of a ZnO crystalline structure, the crystalline structure becomes unstable or the acceptor level becomes too deep, which is not preferable. By doping Ga, a Ga-N bond is formed, and if the amount of N becomes larger than that of Ga, the doped N can effectively act as an acceptor with an -N-Ga-N- bond. However, even if N and Ga are simply supplied, the reaction forming ZnO and GaO proceeds early, so that the -N-Ga-N- bond cannot be obtained; instead, Ga substitutes for Zn to form an -O-Zn-O-Ga-O- structure and acts as an n-type dopant. Therefore, the p-type layer is adversely affected.
The present inventors found that, by stopping the supply of at least the O material while supplying the Ga material during doping, an -N-Ga-N- bond is obtained; this bond is further combined with O of the ZnO semiconductor layer so that N is bonded to Ga in an -O-Zn-N-Ga-N-Zn-O- configuration, and a p-type ZnO semiconductor layer in which N acts as an effective acceptor can be obtained.
The present invention provides a method of growing a p-type ZnO based oxide semiconductor layer wherein when a p-type dopant material made of N and an n-type dopant material are to be simultaneously supplied to grow the p-type ZnO based oxide semiconductor layer, at least the supply of O in raw materials constituting a ZnO based oxide is stopped when supplying the n-type dopant material, and thereby carrying out growth.
The ZnO based oxide semiconductor means an oxide containing Zn, and specifically includes, in addition to ZnO, an oxide of a IIA group element and Zn, a IIB group element and Zn, or IIA and IIB group elements and Zn.
By using the method described above, the bonding of Zn or the III group element with O is suppressed, and the bond of N (the p-type dopant) with the III group element such as Ga (the n-type dopant) is obtained and incorporated into the ZnO based crystal, therefore acting as an effective p-type dopant having a shallow acceptor level.
More specifically, the growth can be carried out by repeating a step of growing a ZnO based oxide semiconductor layer while supplying the p-type dopant material made of N together with the raw materials constituting the ZnO based oxide and a step of stopping supply of at least O in the raw materials constituting the ZnO based oxide, and supplying the n-type dopant material made of a III group element.
In another method the growth can be carried out by repeating a step of growing a ZnO based oxide semiconductor layer by supplying the raw materials constituting the ZnO based oxide without supplying any dopant materials and a step of stopping supply of at least O in the raw materials constituting the ZnO based oxide, and supplying the p-type dopant material made of N and the n-type dopant material made of the III group element.
It is preferable that the supply time or the supply amount be regulated such that more of the p-type dopant material made of N is supplied than of the n-type dopant material made of the III group element.
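The alternating-supply cycles described above can be pictured as a simple shutter/valve schedule. The following Python sketch is only a minimal, hypothetical illustration of that sequencing for the first variant (N supplied with the ZnO raw materials, then O stopped while Ga is pulsed); the SupplyStep class, the step durations, and the on/off pattern are illustrative assumptions and not part of the disclosure. Keeping the Ga step much shorter than the N step is one way to satisfy the preference that more N than Ga is supplied.

from dataclasses import dataclass

@dataclass
class SupplyStep:
    """One step of a pulsed codoping schedule (hypothetical illustration)."""
    duration_s: float
    zn_open: bool
    o_open: bool
    n_plasma_open: bool
    ga_open: bool

def codoping_cycle(n_step_s: float = 10.0, ga_step_s: float = 2.0) -> list:
    """Return one cycle: grow ZnO:N, then stop O (and Zn) and pulse Ga."""
    return [
        SupplyStep(n_step_s, zn_open=True, o_open=True, n_plasma_open=True, ga_open=False),
        SupplyStep(ga_step_s, zn_open=False, o_open=False, n_plasma_open=False, ga_open=True),
    ]

if __name__ == "__main__":
    # Repeat the two-step cycle to build up the p-type layer.
    schedule = [step for _ in range(5) for step in codoping_cycle()]
    for i, step in enumerate(schedule):
        print(i, step)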
The present invention provides a method for manufacturing a semiconductor light emitting device, in which semiconductor layers made of a ZnO based oxide semiconductor having at least an n-type layer and a p-type layer are provided on a surface of a substrate so as to form a light emitting layer forming portion, wherein the p-type layer is formed by the method according to any methods described above. |
US Intelligence Officer Lt Col Tony Shaffer: NSA Officials Who Hated Clintons Hacked DNC and Gave Emails to WikiLeaks
Guest post by Joe Hoft
Lieutenant Colonel Tony Shaffer (ret.) went on with Hannity on Fox News to discuss the latest leaks by Wikileaks.
Shaffer told Hannity former NSA operatives who were fed up with the Clintons are the ones who hacked into the DNC and gave the hacked Podesta emails to WikiLeaks. The Democrats do not want to talk about this and it is probably why the DNC refused to allow the FBI to look at its hacked server. Shaffer said:
Tony Shaffer: Sean, we did it. Not me, but our guys, former members of NSA, retired intelligence officers used these tools to break in there and get the information out. That’s what the Democrats don’t want to talk about because it doesn’t fit their narrative.
Via Hannity:
Hannity said it was worse than any spy novel he can think of.
Hat tip William H |
PHARMACEUTICALLY VALUABLE BIOACTIVE COMPOUNDS OF ALGAE
Pharmaceutically valuable products from microalgae and their industrial commercialization are still in their infancy and can be seen as a gateway to a multibillion-dollar industry. Microalgae generally grow autotrophically and are ubiquitous in nature. They represent a major untapped resource of genetic potential for valuable bioactive agents and fine biochemicals. This proven ability of microalgae to produce such compounds places these microorganisms in the biotechnological spotlight for applications and commercialization in the pharmaceutical industry. The production of microalgal metabolites, which stimulate defense mechanisms in the human body, has spurred intense study of the application of microalgal biomass and products thereof in various food preparations and pharmacological and medical products. There is, therefore, a huge scope for further study of the identified algal compounds and their activities in the treatment and prevention of various diseases, in addition to an ongoing search for other, as yet undetected, metabolites.
INTRODUCTION
The markets for both pharmaceuticals and nutraceuticals are growing quickly worldwide, and it is this global scope that particularly attracts consumers. A growing proportion of today's promising pharmaceutical research focuses on the production of potent bioactive compounds from algae. Thus, the untapped potential of algae in the field of pharmaceuticals still has to be explored in order to grow and capitalize on tremendous global marketing opportunities. Algae are emerging as one of the most promising sources of sustainable crops with potential health benefits, including protein, omega-3 fatty acids, and antioxidants. The pharmaceutical potential of the large variety of algal species is just starting to be explored. A lot of research aims to enhance particular pigments and products within certain algal species that have nutritional, nutraceutical, or pharmaceutical value. While the pharmaceutical content in the common baseline algal strains is small, current market values for these products are extremely high. The major products currently being commercialized or under consideration for commercial extraction include carotenoids, phycobilins, fatty acids, polysaccharides, vitamins, sterols, and biologically active molecules for use in human and animal health. There is a range of pharmaceutical products derived from algae, including antimicrobials, antivirals and antifungals, neuroprotective products, therapeutic proteins, and drugs.
WHAT ARE BIOACTIVE COMPOUNDS?
Bioactive compounds are physiologically active substances with functional properties in the human body. There is, therefore, great enthusiasm for the development and manufacture of various biocompounds that can potentially be used as functional ingredients, such as carotenoids, phycocyanins (PC), polyphenols, fatty acids, and polyunsaturated compounds. An interest in the production of bioactive compounds from natural sources has recently emerged, driven by a growing number of scientific studies that demonstrate the beneficial effects of these compounds on health. These natural products are important in the search for new pharmacologically active compounds and play an important role in new drug discovery for the treatment of human diseases. Many clinically viable and commercially available drugs with antitumor and anti-infective activity originated as natural products.
MICROALGAE AS A SOURCE OF BIOACTIVE COMPOUNDS
Algae, in general, are found all over the globe and in every ecological niche conceivable. They therefore have unique properties that help them survive even the adverse conditions they encounter in the ecosystem. These unique attributes are brought about by changes in the macro- and micro-molecular constituents of the cell, which are formed under the stressed situations the algae face. These unique metabolites often have special properties and can be considered bioactive compounds in addition to the macromolecules the algae generally have. There are thousands and thousands of algal species, and only 25-30% of them have been identified and collected. Hence, there is a huge unexplored resource available to be exploited in the pharmaceutical industry. Microalgae are known to produce various therapeutically effective biocompounds that can be obtained from the biomass or released extracellularly into the medium. These microorganisms contain many bioactive compounds such as proteins, polysaccharides, lipids, vitamins, enzymes, sterols, and other high-value compounds with pharmaceutical and nutritional importance that can be employed for commercial use.
TYPES OF BIOACTIVE COMPOUNDS
Bioactive compounds from microalgae can be obtained directly from primary metabolism, such as proteins, fatty acids, vitamins, and pigments, or can be synthesized from secondary metabolism. Such compounds can present antifungal, antiviral, antialgal, antienzymatic, or antibiotic actions. Many of these compounds (cyanovirin, oleic acid, linolenic acid, palmitoleic acid, vitamin E, B12, β-carotene, PC, lutein, and zeaxanthin) have antimicrobial, antioxidant, and anti-inflammatory properties, with the potential for the reduction and prevention of diseases. In most microalgae, the bioactive compounds are accumulated in the biomass; however, in some cases, these metabolites are excreted into the medium; these are known as exometabolites. Several microalgae have been reported to show antimicrobial, antioxidant, and anticancer activity. These studies have been based on the extraction of bioactive compounds from these microalgae. The prokaryotic blue-green algae or cyanobacteria are known to produce intracellular and extracellular metabolites with potential biological activities, such as antibacterial, antifungal, antiviral, antitumor, anti-human immunodeficiency virus (HIV), anti-inflammatory, antioxidant, antimalarial, herbicidal, and immunosuppressant effects. The therapeutic importance of Spirulina, one of the most extensively studied blue-green algae, has been reported in several studies. These include its use in the treatment of hyperlipidemia, cancer, HIV, diabetes, obesity, and hypertension, the improvement of the immune response, renal protection against heavy metals and drugs, and the reduction in serum levels of glucose and lipids, among others. The biomass of Nostoc, another blue-green alga, has been used in the medical field and as a dietary supplement because of its protein, vitamin, and fatty acid content. The medical value of this microalga was established by its use in the treatment of fistula and for some forms of cancer. Historically, the biomass of Nostoc is described as anti-inflammatory, and it is also found to aid in digestion, blood pressure control, and immune boosting. Cyanovirin, a potent protein molecule produced by a Nostoc species, showed a positive effect in the treatment of HIV and influenza A (H1N1).
Nostoc species also contain a spectrum of polyunsaturated fatty acids (PUFAs) that includes essential fatty acids such as linoleic, α-linolenic, γ-linolenic, octadecatetraenoic, and eicosapentaenoic acids. Essential fatty acids are precursors of prostaglandins, attracting significant interest from the pharmaceutical industry. Several other studies suggest that Nostoc produces compounds with antimicrobial, antiviral, and anticancer activity. These results have encouraged its cultivation on a large scale, and it has great economic potential due to its nutritional value and its importance to the pharmaceutical industry. Chlorella, a very common green alga, was discovered by the Japanese, traditional consumers of algae, who usually eat and enjoy it as a food supplement. Chlorella is rich in chlorophyll, proteins, polysaccharides, vitamins, minerals, and essential amino acids, with molecular constituents of 53% (w/w) protein, 23% (w/w) carbohydrate, 9% (w/w) lipids, and 5% (w/w) minerals and oligo-elements. These nutrient concentrations can be varied by manipulation of the culture conditions in which the cells are grown. The biomass of Chlorella is also rich in the vitamin B complex, especially B12, which is vital in the formation and regeneration of blood cells. Like Spirulina, Chlorella has a GRAS certificate issued by the FDA and can thus be used as food without risk to human health when grown in a suitable environment with proper hygiene and good manufacturing practices. The pharmaceutical importance of Chlorella is attributed to its medicinal properties. There is ample experimental evidence of its antitumor, anticoagulant, antibacterial, antioxidant, and antihyperlipidemic effects, in addition to a hepatoprotective property and the immune-stimulatory activity of its enzymatic protein hydrolyzate. Many antioxidant compounds are thought to be responsible for Chlorella's functional activities. Antioxidants such as lutein, α-carotene, β-carotene, ascorbic acid, and α-tocopherol, which are active against free radicals, have been identified. Some of these compounds not only are important as natural colorants or additives but also may be useful in reducing the incidence of cancer and in the prevention of macular degeneration. By far one of the most important bioactive compounds in Chlorella is β-1,3-glucan, an active immune stimulator that reduces free radicals and blood cholesterol. The efficacy of this compound against gastric ulcers, sores, and constipation has been reported. It also has been demonstrated to have preventive action against atherosclerosis and hypercholesterolemia, as well as antitumor activity. Chlorella is produced by more than 70 companies. Taiwan Chlorella Manufacturing Co. (Taipei, Taiwan) is the world's largest producer of Chlorella, with over 400,000 tons of biomass produced per year. Significant production also occurs in Klötze (Germany) (80-100 t year−1 of dry biomass). Dunaliella is also a green unicellular halotolerant microalga that has been extensively studied for its pharmaceutically active compounds. This microalga is widely studied as an extremophile with a unique physiology and, therefore, many biotechnological applications. Dunaliella is a source of carotenoids, glycerol, lipids, and other bioactive compounds such as enzymes and vitamins. This microalga is a major source of natural β-carotene, which it can produce at up to 14% of its dry weight under conditions of high salinity, light, and temperature as well as nutrient limitation.
In addition to β-carotene, this microalga is rich in protein and essential fatty acids, which can be consumed safely, as evidenced by GRAS recognition. Compounds in the Dunaliella biomass have various biological activities such as antioxidant, antihypertensive, bronchodilatory, analgesic, muscle relaxant, hepatoprotective, and antiedemal properties. The microalgal biomass can also be used directly in food and pharmaceutical formulations. Chang et al. showed that Dunaliella cells contained antibiotic substances. According to these authors, the crude extract of this microalga strongly inhibited the growth of Staphylococcus aureus, Bacillus cereus, Bacillus subtilis, and Enterobacter aerogenes. In another study, Dunaliella showed antibacterial activity against other microorganisms of importance to the food industry, including Escherichia coli, Candida albicans, and Aspergillus niger. Under ideal growing conditions, Dunaliella can be stimulated to produce approximately 400 mg of β-carotene per square meter of growing area. The cultivation of Dunaliella for the production of β-carotene has been conducted in several countries, including Australia, India, Israel, the USA, and China. An ingredient of Dunaliella with a strong ability to stimulate cell proliferation and improve the energy metabolism of the skin was released by Pentapharm (Basel, Switzerland). New pilot plants are under development in India, Chile, Mexico, Cuba, Iran, Taiwan, Japan, Spain, and Kuwait.
ALGAL MACROMOLECULES AS BIOACTIVE COMPOUNDS AND THEIR PHYSIOLOGICAL EFFECTS
Oxidative damage caused by reactive oxygen species (ROS) to lipids, proteins, and nucleic acids can cause many chronic diseases such as heart disease, atherosclerosis, cancer, and aging. In general, microalgal strains are considered a rich source of antioxidants, with potential applications in pharmaceuticals, food, and cosmetics. Antioxidant compounds, such as dimethylsulfoniopropionate and mycosporine-like amino acids, have been isolated from microalgae and are potent chemical blockers of UV radiation. In addition to these compounds, pigments, lipids, and polysaccharides with antioxidant activity can also be found in microalgae. A good example of such compounds is C-phycocyanin (C-PC), a blue photosynthetic pigment that belongs to the group of phycobiliproteins and is found in large quantities in cyanobacteria, Rhodophyta, and Cryptophyta. PC has applications as a nutrient and as a natural colorant in food and cosmetics. In addition, it has applications in medical diagnostics and pharmacology, in the detection of cancer, and is therefore of great pharmaceutical importance. It is usually extracted from the biomass of Spirulina, Porphyridium cruentum, and Synechococcus. Among the carotenoid compounds, β-carotene and astaxanthin are prominent. These compounds have applications in the food and pharmaceutical industries because of their antioxidant properties and pigmentation ability. Polysaccharides represent a class of high value-added components with applications in pharmaceuticals, food, cosmetics, and fabrics, and as stabilizers and emulsifiers. Microalgal polysaccharides that contain sulfate esters are referred to as sulfated polysaccharides and possess unique medical applications. The basic mechanism of therapeutic action is based on the stimulation of macrophages and the modulation of the immune system. The biological activity of sulfated polysaccharides is linked to their sugar composition and to the position and degree of sulfation.
Some studies have reported that sulfated polysaccharides derived from microalgae inhibit infection by viruses such as encephalomyocarditis virus, herpes simplex virus types 1 and 2 (HSV-1, HSV-2), HIV, viral hemorrhagic septicemia virus of salmonids, swine fever virus, and varicella virus. Carrageenan is a sulfated polysaccharide that can directly bind to human papillomavirus to inhibit not only the viral adsorption process but also viral entry and the subsequent uncoating of the virus. The importance of polysaccharides in the pharmaceutical industry lies in the fact that these compounds are relatively easy to extract from microalgae. The lipid composition of microalgae is found to be responsible for their antimicrobial activity. This antimicrobial property of microalgae is due to their potential to produce compounds such as α- and β-ionone, β-cyclocitral, neophytadiene, and phytol. Antimicrobial activity against human pathogens, such as E. coli, Pseudomonas aeruginosa, S. aureus, and Staphylococcus epidermidis, has been attributed to linolenic acid, eicosapentaenoic acid, hexadecatrienoic acid, docosahexaenoic acid, palmitoleic acid, lauric acid, oleic acid, lactic acid, and arachidonic acid. Microalgae produce several anti-inflammatory compounds in their biomass that may exert a protective function in the body when consumed as food or used in pharmaceuticals and cosmetics. Because of their anti-inflammatory properties, microalgal biomass is being considered for applications in tissue engineering, for the development of scaffolds for use in the reconstitution of organs and tissues. This is an important application for humans, especially in patients with burns in which the skin was completely lost. Microalgal compounds with such properties are the long-chain PUFAs, sulfated polysaccharides, and pigments. Many microalgal polysaccharides possess the ability to modulate the immune system through the activation of macrophage functions and the induction of ROS, nitric oxide, and various other types of cytokines/chemokines. Macrophages are able to regulate several innate responses and secrete cytokines and chemokines that serve as signals for immune and inflammatory molecular reactions. Sulfated polysaccharides with anti-inflammatory activity can be applied in skin treatments, inhibiting the migration and adhesion of polymorphonuclear leukocytes. In humans, the oxidation reactions driven by ROS can lead to irreversible damage to cellular components, including the degradation of lipids and proteins and DNA degradation and/or mutation. Consequently, this damage can lead to several syndromes such as cardiovascular disease, some cancers, and the degenerative diseases of aging. Pigments derived from microalgae have neuroprotective properties, being valuable as functional ingredients in pharmaceutical products that show efficient action in the treatment and/or prevention of neurodegenerative diseases. Vitamin E derived from algae has preventive effects for many diseases, such as atherosclerosis and heart disease, as well as neurodegenerative diseases, such as multiple sclerosis. Carotenoids have great potential benefits to human health, including in the treatment of degenerative diseases, such as macular degeneration and cataract development. These compounds act as antioxidants, reducing oxidative damage by ROS. Studies have indicated that increased intake of phenols decreases the occurrence of degenerative diseases.
Phenolic compounds from microalgae with the potential to fight free radicals have been reported. Scientific findings support astaxanthin as a multimodal intervention for many forms of degenerative disease, including cardiovascular diseases, cancer, metabolic syndrome, cognitive impairment, age-related immune dysfunction, stomach and ocular diseases (macular degeneration, cataract, glaucoma, diabetic retinopathy, and retinitis pigmentosa), and skin damage. High levels of algal lycopene in plasma and tissues were inversely related to coronary heart disease, myocardial infarction, and the risk of atherosclerosis.
CONCLUSION
Bioactive metabolites of microalgal origin are of special interest in the development of new products for the pharmaceutical, cosmetic, and food industries. Further research should be conducted with these bioactive compounds to verify their beneficial effects for humans, their degradability when released into the environment, and their effects when used in animals. Pharmaceutically valuable products and their industrial commercialization are still in their infancy and can be seen as a gateway to a multibillion-dollar industry. Scientists have just started to tap the enormous biological resource and physiological potential of microalgal species growing in all ecological niches. In recent years, innovative processes and products have been introduced in both macro- and microalgal biotechnology. One can expect that future trends in microalgal utilization in the pharmaceutical industry will lead to a diversity of technical solutions for the use of photobioreactors (PBRs) for cultivating microalgae. These will be adapted to the autecological demands of strains and to application aims for biomass, valuable substances, and ecology. An exhaustive inventory of species in all regions, accompanied by proper taxonomic handling and strain collection, could be a basis for future success. While the use of microalgae in functional foods and animal feed could soon reach the level of mass products, their use in pharmaceutical applications appears to be developing very rapidly.
. Neuroprotective effects of magnesium (Mg(2+)) have been shown in experimental studies by using animal models of the cerebral ischemia and cerebral contusion. We review the neuroprotective effects of Mg(2+) in the histological and behavioral studies. In the animal models of cerebral ischemia, Mg(2+) treatment reduces the infarct volume, inhibits the neuronal death, and attenuates the motor impairments. In the animal models of cerebral contusion, Mg(2+) treatment inhibits the neuronal death and edema, and attenuates not only the motor impairments but also cognitive dysfunction. |
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package javax.naming;
/**
* A <code>StringRefAddr</code> refers to an address which is represented by a
* string such as a URL or hostname.
*
*/
public class StringRefAddr extends RefAddr {
/*
* -------------------------------------------------------------------
* Constants
* -------------------------------------------------------------------
*/
/*
* This constant is used during deserialization to check the J2SE version which
* created the serialized object.
*/
static final long serialVersionUID = -8913762495138505527L; //J2SE 1.4.2
/*
* -------------------------------------------------------------------
* Instance variables
* -------------------------------------------------------------------
*/
/**
* The address itself.
* For StringRefAddr the address is a string such as a URL or hostname.
*
* @serial
*/
private String contents;
/*
* -------------------------------------------------------------------
* Constructors
* -------------------------------------------------------------------
*/
/**
* Constructs a <code>StringRefAddr</code> object using the supplied
* address type and address.
*
* @param type the address type which cannot be null
* @param address the address itself which may be null
*/
public StringRefAddr(String type, String address) {
super(type);
this.contents = address;
}
/*
* -------------------------------------------------------------------
* Methods override parent class RefAddr
* -------------------------------------------------------------------
*/
/**
* Get the string containing this address.
*
* @return a string containing this address which may be null
*/
@Override
public Object getContent() {
return contents;
}
}
|
sRAGE alleviates SARS-CoV-2-induced pneumonia in hamster
Dear Editor,
Since last year, the most demanding task of the global pharmaceutical community has been focused on the development of strategies to treat coronavirus disease 2019 (COVID-19). To date, several vaccines have been developed and already demonstrated their efficacy in reducing the incidence of COVID-19 [1]. However, the development of drugs treating COVID-19 is lagging far behind, and all the current treatment regimens have their limitations [2]. The lung is the major and usually the initial organ to be attacked by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).
Therefore, putting lung inflammation under control is expected to alleviate whole-body inflammatory responses and inflammation-induced organ damage. In principle, the inflammatory responses incurred by SARS-CoV-2 involve diversified upstream stimuli, multiple signaling pathways, and numerous downstream effectors [3]. Therefore, targeting any particular cytokine or even signaling pathway may not be sufficient to alleviate the systemic overreaction of the immune system in severe COVID-19, namely the cytokine storm. It is urgent and pivotal to develop means which can systemically tune down the exaggerated immune responses via a relatively central node in the inflammation signaling cascades. The receptor for advanced glycation endproducts (RAGE) is a multiligand, pro-inflammatory pattern recognition receptor that is implicated in both infectious and sterile inflammatory conditions [4]. Upon ligand binding to the receptor, it transduces signals via several downstream kinases, including MAPKs, PI3K/Akt, and JAK, which in turn activate the transcription factors NF-κB, AP-1, and Stat3. These transcription factors promote the expression of important cytokines, such as TNF-α, IL-1β, and IL-6. Remarkably, RAGE is almost exclusively expressed in the lung and involved in multiple lung diseases. Soluble RAGE (sRAGE) is a splicing variant or a post-translationally cleaved short form of RAGE which lacks the transmembrane and intracellular C-terminal domain, thus serving as a decoy receptor to attenuate inflammatory responses initiated by the full-length RAGE [5]. However, it is unknown whether RAGE signaling plays a role in SARS-CoV-2-induced pneumonia, and if so, whether sRAGE can be applied as a therapeutic agent to treat COVID-19. In order to investigate the role of RAGE/sRAGE in SARS-CoV-2-induced pneumonia, we used a SARS-CoV-2-inoculated hamster model of COVID-19 (Supplementary Fig. S1a) with RAGE highly expressed in the lung (Supplementary Fig. S1b). For therapeutic intervention, the infected hamsters were treated with sRAGE or human serum albumin (HSA) (Supplementary Fig. S1c), starting from day 1 post-inoculation (1 dpi). SARS-CoV-2 inoculation caused severe pneumonia in the hamsters, and sRAGE treatment profoundly mitigated SARS-CoV-2-induced pneumonia (Fig. 1a-c) and significantly delayed body weight loss (F = 2.363, P = 0.025) (Supplementary Fig. S1d), although the RAGE expression and the viral RNA loads in lung tissues were not different between sRAGE- and HSA-treated groups (Supplementary Fig. S1e, f). Remarkably, sRAGE significantly reduced the diffusely thickened alveolar septa, multifocal exudation, and accumulation of inflammatory cells in the perivascular and peribronchial spaces in the lung (Fig. 1a, b), and sRAGE treatment also reduced the number of hamsters with severe interstitial pneumonia (Fig. 1c). Furthermore, immunohistochemical staining demonstrated that the SARS-CoV-2-induced recruitment of CD3-positive T cells and the expression of myxovirus resistance protein 1 (Mx1, also known as Mx2 in hamster) in the lung were greatly reduced by sRAGE treatment (Fig. 1d), suggesting alleviated lung inflammation induced by SARS-CoV-2.
Consistently, quantitative reverse transcription polymerase chain reaction (RT-qPCR) showed that the mRNA levels of the macrophage marker CD68, inflammatory disease markers including IFIT3 and Mx1, and inflammatory cytokines including IL-1β, IL-6, TNF-α, IL-18, and IL-10, as well as ICAM1, were induced by virus infection, while the induction was significantly repressed by the treatment with sRAGE (Fig. 1e and Supplementary Fig. S1g). These results corroborate that sRAGE can effectively suppress SARS-CoV-2-triggered pneumonia. To systematically analyze the molecular pathogenesis in the lung upon SARS-CoV-2 infection and sRAGE treatment, we used stable isotope-labeled proteomics analysis (TMTpro, 16plex) to profile the whole proteome of the lung tissues. The list of the proteins identified and analyzed is shown in Supplementary data S1. The hierarchical clustering analysis separated the sRAGE-treated samples from those untreated or treated with HSA (Fig. 1f). Thus, in the following analysis, the samples from untreated or HSA-treated groups were combined into one group (labeled as "infection"), and the samples from sRAGE-treated animals were labeled as "sRAGE". Consistently, principal component analysis (PCA) also separated the samples into three groups based upon their proteomic profiles (Supplementary Fig. S2a). Importantly, sRAGE treatment attenuated the increases of 74.7% (408 out of 546) of the SARS-CoV-2 infection-upregulated proteins (Supplementary Fig. S2b and Supplementary data S1-sheet 4). Several major pathways affected by virus infection were identified using Reactome or KEGG analysis of the 546 upregulated proteins (Supplementary Fig. S2c, d and Supplementary data S1-sheet 5, 6). Most of the upregulated proteins involved in inflammation and DNA replication upon SARS-CoV-2 infection were downregulated by sRAGE treatment according to the recovery score (Supplementary Fig. S2e, f). The infection of SARS-CoV-2 instigated profound inflammatory responses in the lung, as evidenced by the upregulation of multiple inflammation-related proteins, while most of these changes were ameliorated by sRAGE treatment (Fig. 1g). Immunohistochemical staining in lung tissues revealed that the sRAGE treatment resulted in the downregulation of total and phosphorylated p65 transcription factors and their nuclear localization (Supplementary Fig. S3a), as well as reduced signal intensity of total and phosphorylated MAPK p38 proteins in the sRAGE-treated lungs (Supplementary Fig. S3b). In line with these observations, proteomic data confirmed that the infection-induced alterations in the NF-κB and p38 signaling were repressed by sRAGE (Fig. 1h). Furthermore, the upregulation of JAK/STAT signaling components caused by SARS-CoV-2 infection was also mitigated by sRAGE treatment (Fig. 1i). In addition to these known downstream pathways of RAGE, the Toll-like receptor signaling cascades were also curbed by sRAGE treatment, as the protein levels of TLR7, TLR2, Myd88, IRF9, and Syk all tended to decline in response to sRAGE treatment (Fig. 1j). Moreover, other elevated proteins related to inflammatory signaling, including Cdk7, Ddx58, Dock2, and Ifih, were also restored by sRAGE treatment (Supplementary Fig. S3c). Taken together, the above results strongly indicate that treatment with sRAGE suppresses the virus-triggered, exaggerated inflammatory responses of multiple inflammatory signaling pathways.
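As a rough illustration of the sample-level analysis described above (hierarchical clustering of the lung proteomes followed by PCA), the sketch below shows how such grouping could be reproduced in Python. It is only a minimal, hypothetical sketch: the random matrix stands in for the normalized protein-by-sample TMT intensities, and the number of clusters, number of components, and linkage method are assumptions rather than the authors' actual pipeline.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

# Placeholder for the normalized protein x sample intensity matrix
# (rows = proteins, columns = control / infection / sRAGE samples).
rng = np.random.default_rng(0)
intensities = rng.normal(size=(2000, 25))

# Z-score each protein across samples, then treat samples as observations.
z = (intensities - intensities.mean(axis=1, keepdims=True)) / intensities.std(axis=1, keepdims=True)
samples = z.T  # shape: (n_samples, n_proteins)

# Hierarchical clustering of samples (Ward linkage), cut into three groups.
tree = linkage(samples, method="ward")
clusters = fcluster(tree, t=3, criterion="maxclust")

# PCA of the same sample profiles for a two-dimensional overview.
coords = PCA(n_components=2).fit_transform(samples)

for sample_id, (cluster, (pc1, pc2)) in enumerate(zip(clusters, coords)):
    print(f"sample {sample_id}: cluster {cluster}, PC1 = {pc1:.2f}, PC2 = {pc2:.2f}")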
Importantly, SARS-CoV-2 infection-induced activation of cell cycle/death-related pathways was also alleviated by sRAGE treatment (Fig. 1k). Particularly, several key factors involved in inflammatory cell death, such as necroptosis-related Ripk1, Ripk3, and Mlkl, pyroptosis-related Gsdmd, as well as apoptosis-related Fas, Caspases 8 and 3, were all decreased in response to sRAGE treatment (Fig. 1l), which should contribute to the reduced cell death in the lung (Supplementary Fig. S4a, b). In fact, TUNEL-positive cells were greatly reduced not only in the lung but also in the heart and kidney (Supplementary Fig. S4a, b), despite the absence of obvious histological changes in both heart and kidney tissues (Supplementary Fig. S4c), suggesting that sRAGE treatment ameliorates the systemic tissue damage caused by SARS-CoV-2 infection. In summary, we have performed the first "proof-of-concept" study of using sRAGE to treat COVID-19 in the hamster model. The results have demonstrated that sRAGE can potently and systemically attenuate the overactivation of inflammatory responses triggered by SARS-CoV-2 infection. A combination of sRAGE with certain anti-viral drugs may provide a more effective treatment for COVID-19. Our study provides strong evidence supporting the therapeutic potential of using sRAGE in real clinical settings. DATA AVAILABILITY All the datasets used and/or analyzed during this study are available from the corresponding author on reasonable request. Fig. 1 sRAGE alleviates SARS-CoV-2-induced pneumonia via inhibition of multiple signaling pathways involved in exaggerated inflammatory response and cell death. a Representative lung histopathological images (H&E staining) from uninfected control hamsters, and HSA- or sRAGE-treated SARS-CoV-2-infected hamsters. Scale bar = 100 μm. b Pathological score of the lung lesions. n = 10 in HSA-treated group, n = 15 in sRAGE-treated group. c Percentages of severe interstitial pneumonia in HSA- or sRAGE-treated SARS-CoV-2-infected hamsters. n = 10 in HSA-treated group, n = 15 in sRAGE-treated group. d Immunohistochemical staining of CD3- and Mx1-expressing cells in the lung. Scale bar = 100 μm. e Expression levels of CD68, IFIT3, ICAM1, IL-1β, IL-6, and TNF-α in the lung determined by RT-qPCR. n = 4 in control group, n = 10 in infection group, n = 14 in sRAGE-treated group. f Heatmap of the clustered correlation matrix. The samples were clustered into three groups: C1 (consisting of 4 control hamsters and 1 infected, sRAGE-treated hamster), C2 (consisting of 7 infected, sRAGE-treated hamsters and 1 infected, HSA-treated hamster), and C3 (consisting of 5 infected, untreated hamsters, 4 infected, HSA-treated hamsters, and 2 infected, sRAGE-treated hamsters). g Heatmap showing the normalized expression of inflammation-related proteins across control, infection, and sRAGE-treated groups. (h-j) Statistical analysis of protein expression in the p38/NF-κB pathway (h), JAK/STAT pathway (i), and TLR pathway (j) in lung proteomics. n = 5 in control group, n = 10 in infection group, n = 10 in sRAGE-treated group. k Heatmap showing the normalized expression of cell cycle/death-related proteins across control, infection, and sRAGE-treated groups. l Statistical analysis of necroptosis-, pyroptosis-, and apoptosis-related protein expression in the lung proteomics. n = 5 in control group, n = 10 in infection group, n = 10 in sRAGE-treated group. Data are mean ± SEM. *P < 0.05, **P < 0.01, ***P < 0.001 (Student's t-test)
Hetero-epitaxial growth and large piezoelectric effects in (001)- and (111)-oriented PbTiO3/LaNiO3 multilayers Hetero-epitaxial thin films were obtained in PbTiO3 (PTO)/LaNiO3 thin films by using laser ablation for sequential deposition of each thin film. Structural variations were built to characterize the differences in local ferroelectric properties. For this study, 50 nm-thick PTO thin films with (001) and (111) orientations were investigated. Piezoresponse force microscopy (PFM) was used to investigate the local ferroelectricity in the thin films. High piezoelectric responses were obtained in the thin films, with values of approximately 75 and 83 pm V−1 in the (001)- and (111)-oriented thin films, respectively. It is notable that remarkable piezoelectric properties were obtained in the (111)-oriented thin films. The (111)-oriented thin films have the potential to be used as multi-bit memory because of the reduced degradation of ferroelectricity and the availability of piezoelectric response along various directions. We compared the functionalities of the (001)- and (111)-oriented thin films and obtained different performances in the two cases.
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.syncope.ext.scimv2.cxf.service;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.Response.ResponseBuilder;
import org.apache.commons.lang3.ArrayUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.syncope.common.lib.AnyOperations;
import org.apache.syncope.common.lib.SyncopeConstants;
import org.apache.syncope.common.lib.request.MembershipUR;
import org.apache.syncope.common.lib.request.UserUR;
import org.apache.syncope.common.lib.to.EntityTO;
import org.apache.syncope.common.lib.to.GroupTO;
import org.apache.syncope.common.lib.to.ProvisioningResult;
import org.apache.syncope.common.lib.types.PatchOperation;
import org.apache.syncope.core.logic.GroupLogic;
import org.apache.syncope.core.logic.SCIMDataBinder;
import org.apache.syncope.core.logic.UserLogic;
import org.apache.syncope.core.logic.scim.SCIMConfManager;
import org.apache.syncope.core.persistence.api.dao.AnyDAO;
import org.apache.syncope.core.persistence.api.dao.GroupDAO;
import org.apache.syncope.core.persistence.api.dao.UserDAO;
import org.apache.syncope.core.persistence.api.dao.search.MembershipCond;
import org.apache.syncope.core.persistence.api.dao.search.SearchCond;
import org.apache.syncope.ext.scimv2.api.BadRequestException;
import org.apache.syncope.ext.scimv2.api.data.ListResponse;
import org.apache.syncope.ext.scimv2.api.data.SCIMGroup;
import org.apache.syncope.ext.scimv2.api.data.SCIMSearchRequest;
import org.apache.syncope.ext.scimv2.api.service.GroupService;
import org.apache.syncope.ext.scimv2.api.type.ErrorType;
import org.apache.syncope.ext.scimv2.api.type.Resource;
import org.apache.syncope.ext.scimv2.api.type.SortOrder;
public class GroupServiceImpl extends AbstractService<SCIMGroup> implements GroupService {
public GroupServiceImpl(
final UserDAO userDAO,
final GroupDAO groupDAO,
final UserLogic userLogic,
final GroupLogic groupLogic,
final SCIMDataBinder binder,
final SCIMConfManager confManager) {
super(userDAO, groupDAO, userLogic, groupLogic, binder, confManager);
}
@Override
public Response create(final SCIMGroup group) {
// first create group, no members assigned
ProvisioningResult<GroupTO> result = groupLogic.create(SCIMDataBinder.toGroupCR(group), false);
// then assign members
group.getMembers().forEach(member -> {
UserUR req = new UserUR.Builder(member.getValue()).
membership(new MembershipUR.Builder(result.getEntity().getKey()).
operation(PatchOperation.ADD_REPLACE).build()).
build();
try {
userLogic.update(req, false);
} catch (Exception e) {
LOG.error("While setting membership of {} to {}", result.getEntity().getKey(), member.getValue(), e);
}
});
return createResponse(
result.getEntity().getKey(),
binder.toSCIMGroup(
result.getEntity(),
uriInfo.getAbsolutePathBuilder().path(result.getEntity().getKey()).build().toASCIIString(),
List.of(),
List.of()));
}
@Override
public SCIMGroup get(final String id,
final String attributes,
final String excludedAttributes) {
return binder.toSCIMGroup(
groupLogic.read(id),
uriInfo.getAbsolutePathBuilder().build().toASCIIString(),
List.of(ArrayUtils.nullToEmpty(StringUtils.split(attributes, ','))),
List.of(ArrayUtils.nullToEmpty(StringUtils.split(excludedAttributes, ','))));
}
@Override
public Response update(final String id) {
return Response.status(Response.Status.NOT_IMPLEMENTED).build();
}
@Override
public Response replace(final String id, final SCIMGroup group) {
if (!id.equals(group.getId())) {
throw new BadRequestException(ErrorType.invalidPath, "Expected " + id + ", found " + group.getId());
}
ResponseBuilder builder = checkETag(Resource.Group, id);
if (builder != null) {
return builder.build();
}
// save current group members
Set<String> beforeMembers = new HashSet<>();
MembershipCond membCond = new MembershipCond();
membCond.setGroup(id);
SearchCond searchCond = SearchCond.getLeaf(membCond);
int count = userLogic.search(searchCond,
1, 1, List.of(),
SyncopeConstants.ROOT_REALM, false).getLeft();
for (int page = 1; page <= (count / AnyDAO.DEFAULT_PAGE_SIZE) + 1; page++) {
beforeMembers.addAll(userLogic.search(
searchCond,
page,
AnyDAO.DEFAULT_PAGE_SIZE,
List.of(),
SyncopeConstants.ROOT_REALM,
false).
getRight().stream().map(EntityTO::getKey).collect(Collectors.toSet()));
}
// update group, don't change members
ProvisioningResult<GroupTO> result = groupLogic.update(
AnyOperations.diff(SCIMDataBinder.toGroupTO(group), groupLogic.read(id), false), false);
// assign new members
Set<String> afterMembers = new HashSet<>();
group.getMembers().forEach(member -> {
afterMembers.add(member.getValue());
if (!beforeMembers.contains(member.getValue())) {
UserUR req = new UserUR.Builder(member.getValue()).
membership(new MembershipUR.Builder(result.getEntity().getKey()).
operation(PatchOperation.ADD_REPLACE).build()).
build();
try {
userLogic.update(req, false);
} catch (Exception e) {
LOG.error("While setting membership of {} to {}",
result.getEntity().getKey(), member.getValue(), e);
}
}
});
        // remove members that are no longer present in the provided member list
beforeMembers.stream().filter(member -> !afterMembers.contains(member)).forEach(user -> {
UserUR req = new UserUR.Builder(user).
membership(new MembershipUR.Builder(result.getEntity().getKey()).
operation(PatchOperation.DELETE).build()).
build();
try {
userLogic.update(req, false);
} catch (Exception e) {
LOG.error("While removing membership of {} from {}", result.getEntity().getKey(), user, e);
}
});
return updateResponse(
result.getEntity().getKey(),
binder.toSCIMGroup(
result.getEntity(),
uriInfo.getAbsolutePathBuilder().path(result.getEntity().getKey()).build().toASCIIString(),
List.of(),
List.of()));
}
@Override
public Response delete(final String id) {
ResponseBuilder builder = checkETag(Resource.Group, id);
if (builder != null) {
return builder.build();
}
anyLogic(Resource.Group).delete(id, false);
return Response.noContent().build();
}
@Override
public ListResponse<SCIMGroup> search(
final String attributes,
final String excludedAttributes,
final String filter,
final String sortBy,
final SortOrder sortOrder,
final Integer startIndex,
final Integer count) {
SCIMSearchRequest request = new SCIMSearchRequest(filter, sortBy, sortOrder, startIndex, count);
if (attributes != null) {
request.getAttributes().addAll(
List.of(ArrayUtils.nullToEmpty(StringUtils.split(attributes, ','))));
}
if (excludedAttributes != null) {
request.getExcludedAttributes().addAll(
List.of(ArrayUtils.nullToEmpty(StringUtils.split(excludedAttributes, ','))));
}
return doSearch(Resource.Group, request);
}
@Override
public ListResponse<SCIMGroup> search(final SCIMSearchRequest request) {
return doSearch(Resource.Group, request);
}
}
import numpy as np
import pandas as pd
from scipy.stats import rv_histogram as _rv_histogram


def energy_produced(P, seconds):
    """Estimate the energy produced over a period as seconds * E[P], where E[P] is the
    expected power computed from a histogram-based distribution of the power data."""
    assert isinstance(P, (pd.DataFrame, pd.Series)), 'P must be of type pd.Series or pd.DataFrame'
    assert isinstance(seconds, (int, float)), 'seconds must be of type int or float'
    if isinstance(P, pd.DataFrame) and len(P.columns) == 1:
        P = P.squeeze().copy()
    # Build a 100-bin histogram of the power samples and treat it as a distribution
    H, edges = np.histogram(P, 100)
hist_dist = _rv_histogram([H,edges])
x = np.linspace(edges.min(),edges.max(),1000)
expected_val_of_power = np.trapz(x*hist_dist.pdf(x),x=x)
E = seconds * expected_val_of_power
    return E
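A hypothetical usage sketch (the data and sampling period below are made up for illustration and assume the numpy/pandas imports above); if P is in watts and seconds spans the measurement period, the result is an energy estimate in joules:

power = pd.Series(1000 + 50 * np.random.randn(3600))  # one hour of 1 Hz power samples (W), synthetic
E = energy_produced(power, seconds=3600)              # expected energy over the hour (J)
print(E)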
// Decompiled by Jad v1.5.8g. Copyright 2001 <NAME>.
// Jad home page: http://www.kpdus.com/jad.html
// Decompiler options: packimports(3) annotate safe
package com.google.android.exoplayer2.extractor;
import com.google.android.exoplayer2.util.Util;
import java.util.Arrays;
// Referenced classes of package com.google.android.exoplayer2.extractor:
// SeekMap, SeekPoint
public final class ChunkIndex
    implements SeekMap
{

    public ChunkIndex(int sizes[], long offsets[], long durationsUs[], long timesUs[])
    {
        this.sizes = sizes;
        this.offsets = offsets;
        this.durationsUs = durationsUs;
        this.timesUs = timesUs;
        this.length = sizes.length;
        if(this.length > 0)
        {
            // Total duration is the start time of the last chunk plus its duration.
            this.durationUs = durationsUs[this.length - 1] + timesUs[this.length - 1];
        } else
        {
            this.durationUs = 0L;
        }
    }

    public int getChunkIndex(long timeUs)
    {
        // Index of the last chunk whose start time is less than or equal to timeUs.
        return Util.binarySearchFloor(timesUs, timeUs, true, true);
    }

    public long getDurationUs()
    {
        return durationUs;
    }

    public SeekMap.SeekPoints getSeekPoints(long timeUs)
    {
        int chunkIndex = getChunkIndex(timeUs);
        SeekPoint seekPoint = new SeekPoint(timesUs[chunkIndex], offsets[chunkIndex]);
        if(seekPoint.timeUs >= timeUs || chunkIndex == length - 1)
        {
            return new SeekMap.SeekPoints(seekPoint);
        } else
        {
            // Also report the start of the next chunk so callers can seek past timeUs.
            SeekPoint nextSeekPoint = new SeekPoint(timesUs[chunkIndex + 1], offsets[chunkIndex + 1]);
            return new SeekMap.SeekPoints(seekPoint, nextSeekPoint);
        }
    }

    public boolean isSeekable()
    {
        return true;
    }

    public String toString()
    {
        return "ChunkIndex(length=" + length
                + ", sizes=" + Arrays.toString(sizes)
                + ", offsets=" + Arrays.toString(offsets)
                + ", timeUs=" + Arrays.toString(timesUs)
                + ", durationsUs=" + Arrays.toString(durationsUs) + ")";
    }

    private final long durationUs;
    public final long durationsUs[];
    public final int length;
    public final long offsets[];
    public final int sizes[];
    public final long timesUs[];
}
// NOTE: the package declaration and imports below were missing from this file and have been
// reconstructed; the application package is assumed from the @Config annotation below.
package com.blanyal.remindme;

import android.content.Context;
import android.view.View;
import android.widget.EditText;
import android.widget.Switch;
import android.widget.TextView;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.Robolectric;
import org.robolectric.RobolectricGradleTestRunner;
import org.robolectric.annotation.Config;

import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotEquals;
import static org.junit.Assert.assertTrue;

/**
 * Created by swapnil on 12/2/15.
 */
@RunWith(RobolectricGradleTestRunner.class)
@Config(packageName = "com.blanyal.remindme", constants = BuildConfig.class, sdk = 21)
public class ReminderAddActivityTest {
private ReminderAddActivity activity;
private ReminderDatabase reminderDatabase;
private Reminder reminder;
private View mTestRoboActivityView;
static int count=0;
static int idForDelete;
private Context mContext;
public ReminderAddActivityTest(){
}
/*Ref:
http://antonioleiva.com/android-unit-testing-using-robolectric-introduction/
*/
@Before
public void setup() {
activity = Robolectric.buildActivity(ReminderAddActivity.class).create().get();
}
// TC ID 01
@Test
public void testCreateReminder() {
Context context = activity.getApplicationContext();
reminderDatabase = new ReminderDatabase(context);
reminder = new Reminder();
reminder.setID(2);
reminder.setTitle("testFromDa");
reminder.setDate("14/11/2015");
reminder.setTime("11:59");
reminder.setRepeat("true");
reminder.setRepeatNo("1");
reminder.setRepeatType("Hour");
reminder.setActive("true");
whenDataisSave(reminderDatabase, reminder);
}
//TC Id 4
@Test
public void testReminderTitleText() throws Exception {
EditText textView = (EditText) activity.findViewById(R.id.reminder_title);
String temp = "reminder test";
textView.setText(temp);
assertEquals(textView.getText(), temp);
}
//TC Id 05
@Test
public void testEditDateSetText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_date);
textView.setText("11/14/2105");
assertEquals("11/14/2105", textView.getText());
}
//TC Id 06
@Test
public void testEditTimeSetText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_time);
textView.setText("11:20");
assertEquals("11:20",textView.getText());
}
//TC Id 07
@Test
public void testEmptyDateText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.date_text);
assertEquals(textView.getText(), "Date");
}
//TC Id 08
@Test
public void testDefaultDateSetText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_date);
//ref : http://stackoverflow.com/questions/8745297/want-current-date-and-time-in-dd-mm-yyyy-hhmmss-ss-format
String pattern = "dd/MM/yyyy";
String dateInString =new SimpleDateFormat(pattern).format(new Date());
assertEquals(dateInString,textView.getText());
}
//TC Id 09
@Test
public void testEmptyTimeText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.time_text);
assertEquals(textView.getText(), "Time");
}
//TC Id 10
@Test
public void testDefaultTimeSetText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_time);
//ref : http://stackoverflow.com/questions/8745297/want-current-date-and-time-in-dd-mm-yyyy-hhmmss-ss-format
DateFormat dateFormat = new SimpleDateFormat("HH:mm");
String formattedDate = dateFormat.format(new Date()).toString();
assertEquals(formattedDate, textView.getText());
}
//TC Id 11
@Test
public void testEmptyRepeatText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.repeat_text);
assertEquals(textView.getText(), "Repeat");
}
//TC Id 12
@Test
public void testDefaultSwitchOn() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_repeat);
Switch switc = (Switch) activity.findViewById(R.id.repeat_switch);
assertTrue(switc.isChecked());
}
//TC Id 13
@Test
public void testDefaultRepeatSetTextOnSwitchOn() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_repeat);
Switch switc = (Switch) activity.findViewById(R.id.repeat_switch);
if (switc.callOnClick())
assertEquals("Every 1 Hour(s)", textView.getText());
}
//TC Id 14
@Test
public void testDefaultRepeatSetTextOnSwitchOff() throws Exception {
Switch switc = (Switch) activity.findViewById(R.id.repeat_switch);
switc.performClick();
TextView textView = (TextView) activity.findViewById(R.id.set_repeat);
if(switc.isChecked()==false)
assertEquals("Off", textView.getText());
}
//TC Id 15
@Test
public void testEmptyRepeatNoText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.repeat_no_text);
assertEquals(textView.getText(), "Repetition Interval");
}
//TC Id 16
@Test
public void testDefaultRepeatNoSetText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_repeat_no);
assertEquals("1", textView.getText());
}
//TC Id 17
@Test
public void testEmptyTypeOfRepetitionsText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.repeat_type_text);
assertEquals(textView.getText(), "Type of Repetitions");
}
//Tc Id 18
@Test
public void testDefaultEmptyTypeOfRepetitionsSetText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.set_repeat_type);
assertEquals(textView.getText(), "Hour");
}
//TC Id 19
@Test
public void testEmptyDetailText() throws Exception {
TextView textView = (TextView) activity.findViewById(R.id.details);
assertEquals(textView.getText(), "Details");
}
//TC Id 20
@Test
public void testGetReminder()
{
int count=0;
Context context = activity.getApplicationContext();
reminderDatabase = new ReminderDatabase(context);
reminder = new Reminder();
reminder.setID(3);
reminder.setTitle("testFromDa");
reminder.setDate("14/11/2015");
reminder.setTime("11:59");
reminder.setRepeat("true");
reminder.setRepeatNo("1");
reminder.setRepeatType("Hour");
reminder.setActive("true");
reminderDatabase.addReminder(reminder);
whenDataisSave(reminderDatabase,reminder );
count++;
reminder = new Reminder();
reminder.setID(4);
reminder.setTitle("testFromDa4");
reminder.setDate("14/11/2015");
reminder.setTime("11:20");
reminder.setRepeat("true");
reminder.setRepeatNo("1");
reminder.setRepeatType("Hour");
reminder.setActive("true");
whenDataisSave(reminderDatabase, reminder);
count++;
List<Reminder> temp = reminderDatabase.getAllReminders();
count++;
assertEquals(temp.size(), count);
}
//TC Id 22
@Test
public void testMultipleAddReminderForSameData()
{
Context context = activity.getApplicationContext();
reminderDatabase = new ReminderDatabase(context);
reminder = new Reminder();
reminder.setID(6);
reminder.setTitle("testFromDa");
reminder.setDate("14/11/2015");
reminder.setTime("11:59");
reminder.setRepeat("true");
reminder.setRepeatNo("1");
reminder.setRepeatType("Hour");
reminder.setActive("true");
count++;
whenDataisSave(reminderDatabase, reminder);
reminder = new Reminder();
reminder.setID(6);
reminder.setTitle("testFromDa");
reminder.setDate("14/11/2015");
reminder.setTime("11:59");
reminder.setRepeat("true");
reminder.setRepeatNo("1");
reminder.setRepeatType("Hour");
reminder.setActive("true");
whenDataisSave(reminderDatabase, reminder);
List<Reminder> temp = reminderDatabase.getAllReminders();
String title[]= new String[2];
String datet[]= new String[2];
String time[]= new String[2];
String repeat[]= new String[2];
String active[] = new String[2];
int i=0;
for(Reminder r: temp)
{
title[i]=r.getTitle();
datet[i]=r.getDate();
time[i]= r.getTime();
repeat[i]= r.getRepeat();
active[i] = r.getActive();
i++;
if(i>1)
i=0;
}
assertNotEquals(title[0], title[1]);
assertNotEquals(datet[0], datet[1]);
assertNotEquals(time[0], time[1]);
assertNotEquals(repeat[0], repeat[1]);
assertNotEquals(active[0], active[1]);
}
//TC Id 21
@Test
public void testGetReminderCount()
{
Context context = activity.getApplicationContext();
reminderDatabase = new ReminderDatabase(context);
int count = 0;
reminder = new Reminder();
reminder.setID(5);
reminder.setTitle("testFromDa4");
reminder.setDate("14/11/2015");
reminder.setTime("11:20");
reminder.setRepeat("true");
reminder.setRepeatNo("1");
reminder.setRepeatType("Hour");
reminder.setActive("true");
count++;
whenDataisSave(reminderDatabase, reminder);
int countInDb = reminderDatabase.getRemindersCount();
assertEquals(countInDb,count);
}
private Reminder whenDataisSave(ReminderDatabase reminderDatabase, Reminder reminder) {
Reminder expected = new Reminder();
boolean isSave = false;
boolean isRetrive = false;
int a = 0;
idForDelete=a;
try {
a = reminderDatabase.addReminder(reminder);
isSave = true;
} catch (Exception e) {
isSave = false;
}
if (isSave) {
try {
expected = reminderDatabase.getReminder(a);
isRetrive = true;
} catch (Exception e) {
isRetrive = false;
}
count+=1;
if (isRetrive)
assertEquals(reminder.getTitle(), expected.getTitle());
else
assertEquals(reminder.getTitle(), expected.getTitle());
}
return expected;
}
}
#include <Engine/RadiumEngine.hpp>
#include <Gui/TransformEditorWidget.hpp>
#include <QVBoxLayout>
namespace Ra {
namespace Gui {
TransformEditorWidget::TransformEditorWidget( QWidget* parent ) :
QWidget( parent ),
m_layout( new QVBoxLayout( this ) ),
m_translationEditor( nullptr ) {}
void TransformEditorWidget::updateValues() {
if ( canEdit() )
{
getTransform();
        CORE_ASSERT( m_translationEditor, "No editor widget !" );
m_translationEditor->blockSignals( true );
m_translationEditor->setValue( m_transform.translation() );
m_translationEditor->blockSignals( false );
}
}
void TransformEditorWidget::onChangedPosition( const Core::Math::Vector3& v, uint id ) {
CORE_ASSERT( m_currentEdit.isValid(), "Nothing to edit" );
m_transform.translation() = v;
setTransform( m_transform );
}
void TransformEditorWidget::setEditable( const Engine::ItemEntry& ent ) {
delete m_translationEditor;
TransformEditor::setEditable( ent );
if ( canEdit() )
{
m_translationEditor =
new VectorEditor( 0,
QString::fromStdString( getEntryName(
Engine::RadiumEngine::getInstance(), m_currentEdit ) ),
true );
connect( m_translationEditor, &VectorEditor::valueChanged, this,
&TransformEditorWidget::onChangedPosition );
}
}
} // namespace Gui
} // namespace Ra
Endogenous morphine and ACTH association in neural tissues. BACKGROUND Endogenous morphine and proopiomelanocortin-derived peptide-like molecules were identified in molluscan tissues, including the nervous system, supporting their ancient phylogeny. Their presence and function in "simple" animals demonstrate their involvement in mechanisms underlying the stress response, preceding the mammalian neuroendocrine axis. MATERIAL/METHODS Immunocytochemical analysis was used to study the localization of morphine- and adrenocorticotropic hormone (ACTH)-like material in the nervous system of Planorbarius corneus, Mytilus galloprovincialis, Lymnaea stagnalis and Viviparus ater. Acute stress experiments were performed on P. corneus and, by radioimmunoassay, we quantified the expression of an ACTH-like peptide in control and stressed animals. RESULTS We demonstrate that in mollusks a morphine-like compound is differentially distributed in neuronal structures containing an ACTH-like molecule. In P. corneus, the two immunoreactivities appear to be colocalized in neuronal bodies and axonal endings, suggesting a role in neurotransmission/neuromodulation. We also found that these molecules are released into the hemolymph, suggesting neuroendocrine-immunoregulatory communication. Comparative studies on the other mollusks gave different distribution pictures of the two immunoreactivities. In P. corneus, following experimental trauma, the levels of both messengers increase in ganglia and hemolymph at different times, which can be related to their postulated roles. CONCLUSIONS In mollusks more than in mammals, there is a diversified but close association between morphine and ACTH, both acting in the stress response and possibly exerting reciprocal influences, suggesting that the relationship evolved in invertebrates and was conserved during evolution.
Chronic administration of the somatostatin analogue SMS 201-995 does not lead to endogenous antibody formation Plasma samples of seven patients with gut and pancreatic endocrine tumours who have been on long-term treatment with a long-acting somatostatin analogue (SMS 201-995) were investigated for endogenous antibodies to the peptide by incubation with radiolabelled SMS 201-995. The duration of treatment with the somatostatin analogue was between 9 and 26 months and the dose from 100 to 300 μg day−1. In none of the patients could antibodies to SMS be detected. The effect of this somatostatin analogue is unlikely to be impaired by formation of endogenous antibodies, even after long-term treatment.
a='abcdefghijklmnopqrstuvwxyz'
l=[0]*26
while True:
try:
text=raw_input()
except EOFError:
break
for i in text:
ascNum=ord(i.lower())-97
if ascNum > 25 or ascNum < 0:
continue
l[ascNum]+=1
for i in xrange(26):
    print a[i] + ' : '+str(l[i])
// src/main/java/GeneticProgrammingSymbolicEquationSolverApp.java
import core.LinearEquationGeneticProgammingSolver;
public class GeneticProgrammingSymbolicEquationSolverApp {
public static void main(String[] args) {
(new LinearEquationGeneticProgammingSolver()).solve();
}
}
/*
* Copyright 2014 Feedzai
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.feedzai.commons.sql.abstraction.dml;
/**
* Represents functions that are internal to the database engine in place.
*
* @author <NAME> (<EMAIL>)
* @since 2.0.1
*/
public class InternalFunction extends Function {
/**
* Creates a new instance of {@link InternalFunction}.
*
* @param function The function.
*/
public InternalFunction(final String function) {
super(function);
}
/**
* Creates a new instance of {@link InternalFunction}.
*
* @param function The function.
* @param exp The expression.
*/
public InternalFunction(final String function, final Expression exp) {
super(function, exp);
}
@Override
public boolean isUDF() {
return false;
}
}
Q:
Signal Denoising Uniformly in Frequency Domain
I have a noisy sparse signal containing a number of frequency components.
Is there any method to uniformly denoise this signal?
In other words, a method that estimates and eliminates the noise power across all the frequencies in the band and not only at the borders, like in wavelet denoising?
The following plot is an example of what I was saying. I have 3 peaks and want to cancel the noise below these peaks.
A:
This sounds like a great opportunity to attempt to use singular spectrum analysis (SSA):
It appears you have some observed signal, $Y[n]$, which is some sort of mixture, and we take $Y[n]$ and create a Hankel matrix from it using some desired frame length M (this will effectively determine your sub-space resolution...more on that later).
Now that we have our Hankel matrix, we can either compute $R_Y$ directly and then do an eigendecomposition, or we could go with a Singular Value Decomposition of the trajectory matrix. You'll find them to essentially be equivalent if we use the left singular vectors; either method is perfectly valid.
So once we get our matrix of eigenvectors (the left singular vectors), $U$, we need to project them via a linear transform to get principal components, $P$:
$P = Y^H U$.
Now that we're here, we can use the singular values to determine amplitude of the signals themselves, and the principal components essentially determine the "parts" of the signal, i.e. $X[n]$ and $\epsilon[n]$.
Let's say we have N principal components, and we know that we only "need" the first one: from our principal components, we perform what is typically called Eigentriple grouping via the following multiplication:
$C = UP^H$ (where H denotes the Hermitian operator)
This new matrix, C, is what we call the reconstructive component(s) of $Y$. We're not quite done yet, because $C$ is the same dimension as our Hankel matrix, $Y$, so to get back to our signal, we'll need an additional step. To reconstruct the series itself, we'll perform what's called diagonal averaging. The goal here is that we'll finally map back to a single signal, let's call it $Y_{\text{out}}[n]$. It's in this step that we'll be able to utilize that one-to-one relationship of the Hankel matrix to the series to extract out signals. This step is a bit long winded, so I've omitted it for brevity, but you can consult Wikipedia or any of the many papers/texts on SSA for more information.
So that's SSA in a nut-shell, and naturally there is a lot you can do with the method. So long as you have some appreciable signal to noise ratio (SNR) in the frequency domain, I think you'll find SSA works quite well for extracting out frequency-separable signals. The big issue is obviously going to be determining which principal components you care about.
The key to discerning your signals is that if you can assume they're all separable in frequency and have some non-negative SNR value in the frequency domain, SSA will provide a decomposition of these signals, and the power of the individual signals is stored in the eigen/singular values you computed earlier on in the method (you'll have M of these). Simply by inspecting the eigenspectrum, you can typically discern actual signals from noise; noise signals will typically be low power and spread across several principal components/eigenvalues, whereas signals will typically be contained by 1 or very few of the principal components/eigenvalues.
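If it's useful, here's a rough numpy sketch of the pipeline described above (embed into a trajectory/Hankel matrix, SVD, keep a few eigentriples, diagonal-average back to a series). The function name, window length and number of kept components are placeholders I've chosen for illustration, not anything standard:

import numpy as np

def ssa_reconstruct(y, M, components):
    """Keep only the chosen SSA components of a 1-D series y (window length M)."""
    y = np.asarray(y, dtype=float)
    N = y.size
    K = N - M + 1
    # Trajectory (Hankel) matrix: column k is the window y[k:k+M]
    X = np.column_stack([y[k:k + M] for k in range(K)])
    # SVD of the trajectory matrix; the columns of U are the left singular vectors
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Eigentriple grouping: rebuild the trajectory matrix from the kept components only
    X_hat = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in components)
    # Diagonal averaging: average X_hat over its anti-diagonals to map back to a length-N series
    out = np.zeros(N)
    counts = np.zeros(N)
    for i in range(M):
        for j in range(K):
            out[i + j] += X_hat[i, j]
            counts[i + j] += 1
    return out / counts

# Toy example: three tones in white noise. Each real sinusoid occupies roughly two
# eigentriples, so keeping the six strongest components recovers the three peaks
# while dropping most of the noise floor.
t = np.arange(2000)
y = (np.sin(2 * np.pi * 0.01 * t)
     + 0.7 * np.sin(2 * np.pi * 0.03 * t)
     + 0.5 * np.sin(2 * np.pi * 0.07 * t)
     + 0.8 * np.random.randn(t.size))
denoised = ssa_reconstruct(y, M=200, components=range(6))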
So, in short, if you have some time series observation, you can use SSA to attempt to discern individual signals/trends. It's a really powerful tool, and as long as the signals are separable, you should have some reasonable success.
Now if you have several observations of the same data (let's say you had several sensors observing the same signals), you could attempt to use some classic blind source separation techniques, such as principal components analysis (PCA), independent components analysis (ICA), etc. Your question didn't seem to indicate such, but if that's the case let me know and I can include some details on those methods as well. Hope that helps!
import {waffle} from '@nomiclabs/buidler';
import chai from 'chai';
import {deployContract, solidity} from 'ethereum-waffle';
import {utils} from 'ethers';
import fs from 'fs';
import ConstraintPolyLen256Artifact from '../artifacts/ConstraintPolyLen256.json';
import RecurrenceArtifact from '../artifacts/Recurrence.json';
import StarkDigestTestingArtifact from '../artifacts/StarkDigestTesting.json';
import {ConstraintPolyLen256} from '../typechain/ConstraintPolyLen256';
import {Recurrence} from '../typechain/Recurrence';
import {StarkDigestTesting} from '../typechain/StarkDigestTesting';
import recurrence_proofs from './recurrence_proofs.json';
const INITIAL_GAS = 100000000;
chai.use(solidity);
// tslint:disable:space-before-function-paren typedef
describe('Recurrence testing', function(this: any) {
// Disables the timeouts
this.timeout(0);
let constraint_contract: Recurrence;
let verifier_contract: StarkDigestTesting;
let constraint256Contract: ConstraintPolyLen256;
const provider = waffle.provider;
const [wallet] = provider.getWallets();
before(async () => {
constraint256Contract = (await deployContract(wallet, ConstraintPolyLen256Artifact)) as ConstraintPolyLen256;
constraint_contract = (await deployContract(wallet, RecurrenceArtifact, [
constraint256Contract.address,
])) as Recurrence;
verifier_contract = (await deployContract(wallet, StarkDigestTestingArtifact)) as StarkDigestTesting;
});
// Note - This checks the proof of work, but not the whole proof yet
it('Should validate a correct proof', async () => {
for (let i = 0; i < recurrence_proofs.length; i++) {
      // We ts-ignore because it's convenient to abi-encode here rather than in rust
// @ts-ignore
recurrence_proofs[i].public_inputs = utils.defaultAbiCoder.encode(
['uint256', 'uint64'],
[recurrence_proofs[i].public_inputs.value, recurrence_proofs[i].public_inputs.index],
);
// NOTE - Typescript has a very very hard time with the ethers js internal array types in struct encoding
// in this case it's best for the code to ignore it because this is how ethers js understands these types.
const receipt = await
(
// @ts-ignore
await verifier_contract.verify_proof(recurrence_proofs[i], constraint_contract.address, {
gasLimit: INITIAL_GAS,
})
).wait();
// Compute calldata cost
const call_data = utils.arrayify(
verifier_contract.interface.functions.verify_proof.encode([
// @ts-ignore
recurrence_proofs[i],
constraint_contract.address,
]),
);
const call_data_length = call_data.length;
const call_data_zeros = call_data.filter(byte => byte === 0).length;
const call_data_zeros_cost = call_data_zeros * 4;
const calldata_cost = (call_data_length - call_data_zeros) * 16 + call_data_zeros_cost;
// Log gas consumption
let gas_log = '';
gas_log += `ENTER transaction ${INITIAL_GAS} 0\n`;
gas_log += `ENTER calldata ${INITIAL_GAS} 0\n`;
gas_log += `ENTER calldata_zeros ${INITIAL_GAS - calldata_cost + call_data_zeros_cost} 0\n`;
gas_log += `LEAVE calldata_zeros ${INITIAL_GAS - calldata_cost} 0\n`;
gas_log += `LEAVE calldata ${INITIAL_GAS - calldata_cost} 0\n`;
let last_alloc = 0;
for (const event of receipt.events) {
if (event.event !== 'LogTrace') {
continue;
}
const direction = event.args.enter ? 'ENTER' : 'LEAVE';
const name = utils.parseBytes32String(event.args.name);
gas_log += `${direction} ${name} ${event.args.gasLeft} ${event.args.allocated}\n`;
last_alloc = event.args.allocated;
}
gas_log += `LEAVE transaction ${INITIAL_GAS - receipt.gasUsed?.toNumber()} ${last_alloc}\n`;
fs.writeFile(`gas-${i}.log`, gas_log, err => {
if (err) {
// tslint:disable:no-console
console.error(err);
}
});
// TODO - Use better logging
// tslint:disable:no-console
console.log('Proof verification gas used : ', receipt.gasUsed?.toNumber());
}
});
});
use crate::test;
test!(
string_format1,
r#"
@uri "https://www.google.com/search?q=\(.search)"
"#,
r#"
{"search":"what is jq?"}
"#,
r#"
"https://www.google.com/search?q=what%20is%20jq%3F"
"#
);
test!(
string_format2,
r#"
@html
"#,
r#"
"This works if x < y"
"#,
r#"
"This works if x < y"
"#
);
test!(
string_format3,
r#"
@sh "echo \(.)"
"#,
r#"
"O'Hara's Ale"
"#,
r#"
"echo 'O'\\''Hara'\\''s Ale'"
"#
);
test!(
string_format4,
r#"
@base64
"#,
r#"
"This is a message"
"#,
r#"
"VGhpcyBpcyBhIG1lc3NhZ2U="
"#
);
test!(
string_format5,
r#"
@base64d
"#,
r#"
"VGhpcyBpcyBhIG1lc3NhZ2U="
"#,
r#"
"This is a message"
"#
);
Future Directions for Clinical Respiratory Fungal Research There has been a growing appreciation of the importance of respiratory fungal diseases in recent years, with better understanding of their prevalence as well as their global distribution. In step with the greater awareness of these complex infections, we are currently poised to make major advances in the characterization and treatment of these fungal diseases, which in itself is largely a consequence of post-genomic technologies which have enabled rational drug development and a path towards personalized medicines. These advances are set against a backdrop of globalization and anthropogenic change, which have impacted the world-wide distribution of fungi and antifungal resistance, as well as our built environment. The current revolution in immunomodulatory therapies has led to a rapidly evolving population at-risk for respiratory fungal disease. Whilst challenges are considerable, perhaps the tools we now have to manage these infections are up to this challenge. There has been a welcome acceleration of the antifungal pipeline in recent years, with a number of new drug classes in clinical or pre-clinical development, as well as new focus on inhaled antifungal drug delivery. The post-genomic revolution has opened up metagenomic diagnostic approaches spanning host immunogenetics to the fungal mycobiome that have allowed better characterization of respiratory fungal disease endotypes. When these advances are considered together the key challenge is clear: to develop a personalized medicine framework to enable a rational therapeutic approach. Introduction Respiratory fungal diseases have risen in prominence in recent years, as a consequence of improved diagnostics, advocacy, research and greater awareness. However, our understanding of whether there has been a genuine increase in the prevalence of respiratory fungal diseases is less clear. Current advances across a range of different areas, including immunophenotyping, metagenomics, antifungal therapies and immunotherapies have opened up exciting opportunities to revolutionize our clinical approach to these complex respiratory infections. Opportunistic Respiratory Mycoses Fungal opportunism of the respiratory tract has been dominated by the aspergilli, and in particular Aspergillus fumigatus. Aspergillosis was first described in humans by Dieulafoy in the 1890s as a primary pulmonary infection, and as fungal rhinitis in 1915. The ubiquitous global nature of this saprophytic mould, with small, highly dispersible conidia which are inhaled on a daily basis, and its thermophilic nature make it ideally suited as a pulmonary pathogen. Other species within the genus have played a prominent role as respiratory mycoses. Within the broader context of allergic fungal airway disease, fungal sensitization may be mediated by thermophilic fungi such as Aspergillus spp., and Candida spp. as well as thermo-intolerant fungi such as the Cladosporium and Alternaria genera. In the immunocompromised host, a much broader range of opportunistic fungi, such as Pneumocystis jirovecii, the mucoromycotina, and Cryptococcus spp. may cause invasive infection. Fungi are also prominent causative agents of hypersensitivity pneumonitis, implicated in farmer's lung disease (Aspergillus fumigatus, Lichtheimia corymbifera), and peat moss exposure (Penicillium spp.) amongst others. Thus, fungi are remarkable in their ability to induce both invasive infections as well as allergic sensitization and hypersensitivity responses. 
Endemic Respiratory Mycoses Endemic respiratory mycoses are characterized as primary pathogens that can also disseminate, often in the context of immunocompromise. Histoplasmosis, an endemic mycosis of the Americas and opportunistic mycosis globally (var. duboisii), was first described as a human pathogen by Darling (Darling's disease) in 1909. Talaromycosis, an East Asian endemic mycosis due to Talaromyces marneffei (formerly Penicillium marneffei), was first described in 1959 by Segretain, and coccidioidomycosis, due to Coccidioides immitis, a mycosis of the Americas, was first described in 1948. Blastomycosis, an American endemic mycosis due to Blastomyces dermatitidis, was first described in 1914. Endemic mycoses gained increasing prominence from the 1980s onwards, mainly as a consequence of their association with the AIDS epidemic, where they are major opportunistic pathogens. The current revolution in immunomodulatory therapies has led to new patient groups becoming at risk of endemic mycoses. Epidemiology The epidemiological drivers of respiratory fungal disease are complex and contingent on a range of factors, many of which may have an anthropogenic basis. For instance, global warming and the global trade in plants may have rapidly accelerated the propensity for new and ecologically invasive species, as well as the rapid global emergence of triazole resistance, recently characterized in A. fumigatus. Changes in building construction, and in particular ventilation, may have led to fundamental shifts in the composition and diversity of the aerial mycobiota in the built environment. Susceptibility to fungal infection is increasingly complex, where novel agents such as ibrutinib and IL-5 modulators have been shown to have the potential to predispose individuals to respiratory fungal infection. The widespread use of steroids is thought to have played a major role in the apparent increasing incidence of chronic respiratory fungal diseases in the context of chronic diseases of the lung. How Can We Use Available and Emerging Antifungals to Improve Outcomes from Respiratory Fungal Disease? Much of our current understanding around the optimal use of antifungals for respiratory fungal diseases has been driven by well-funded, commercial, randomized controlled studies in the context of invasive pulmonary aspergillosis in the immunocompromised host. This has led to the licensing of voriconazole, liposomal amphotericin B (AmBisome), posaconazole and isavuconazole in this setting. However, our understanding of how these agents can be used in the context of chronic respiratory fungal disease is less well defined. Most clinical trial data are centred around itraconazole, a historical azole with poor oral absorption, major drug interactions and significant side effects. Furthermore, historical studies in invasive pulmonary aspergillosis have already established that itraconazole has poor efficacy for invasive aspergillosis, and it is therefore likely to be inferior to newer mould-active triazoles. It is therefore not surprising that studies of itraconazole therapy in the context of chronic pulmonary aspergillosis, allergic bronchopulmonary aspergillosis and severe asthma with fungal sensitization have by and large shown modest or no effect. In addition, the importance of triazole therapeutic drug monitoring, which has not been addressed in clinical trials, but is now de facto standard of care in the real world, is a major confounder for these studies that requires further exploration.
It is notable that those studies that failed did not undertake therapeutic drug monitoring, which is not currently a requirement under the licensing of any triazole antifungal. The only study to address voriconazole in the context of allergic fungal airway disease, EVITA3, also did not involve therapeutic drug monitoring. Furthermore, whilst therapy was stopped at 3 months, clinical endpoint measurements were undertaken at 12 months, on the basis that any antifungal effect would be long-lived. However, unlike itraconazole, voriconazole is hydrophilic and does not persist in tissues in the long-term. Thus, further studies of newer triazoles such as voriconazole, posaconazole and isavuconazole in the context of chronic respiratory fungal disease are urgently needed. There is currently a lack of clarity around the use of antifungals for allergic fungal airway diseases such as severe asthma with fungal sensitization and allergic bronchopulmonary aspergillosis, where historically it was thought that these fungal diseases are purely driven by sensitization to the aerial mycobiota rather than any element of airway infection. This theory seems reasonable in the context of thermointolerant fungi such as Alternaria spp.; however, for allergic bronchopulmonary aspergillosis the presence of hyphae in the mucus and evidence of mucosal inflammation with a positive Aspergillus IgG response in serum are suggestive of airway mycosis. Moreover, recent careful mycological studies suggest the involvement of airway mycosis in allergic airway disease could be much more extensive than is currently believed. A better understanding of these relationships and the role that antifungals could play is urgently needed. We are currently in the midst of a revolution in the antifungal armamentarium, with posaconazole and isavuconazole representing a step change for triazole usage clinically, as a consequence of their improved side effect profiles, absorption and spectrum of action. A number of exciting new drug classes are now in late-phase clinical studies, and there has been renewed interest in inhaled antifungal development. Olorofim (F901318; F2G Ltd., Manchester, UK) is the first of a new class of drugs, the orotomides, that inhibit dihydroorotate dehydrogenase, a key enzyme in pyrimidine synthesis. The drug is available orally and intravenously with wide tissue distribution. Notably, the drug has a wide spectrum of activity against Aspergillus spp., including triazole-resistant strains, as well as more difficult-to-treat opportunistic pulmonary fungal pathogens such as Lomentospora prolificans and Scedosporium spp., as well as the causative agents of endemic mycoses. However, there is a lack of activity against Candida spp., mucoralean fungi, and Cryptococcus spp. There is an open label study ongoing to evaluate the utility of olorofim in individuals with limited treatment options (FORMULA; NCT03583164). Fosmanogepix (APX001; Amplyx, San Diego, CA) is a prodrug metabolized to manogepix, its active form. It disrupts glycosylphosphatidylinositol (GPI)-anchor biosynthesis by inhibiting the enzyme Gwt1 and has good activity in vitro against Aspergillus spp., Cryptococcus neoformans, Scedosporium spp., and Fusarium spp. There is currently a phase 2, multicentre study to evaluate Fosmanogepix for the treatment of invasive fungal infections caused by Aspergillus spp. or rare moulds (e.g.
Scedosporium spp., Fusarium spp., and mucoralean fungi). Inhaled antifungals represent a particularly interesting area for development in the context of respiratory fungal disease, where there is potential to achieve increased concentrations of drug in the respiratory mucosa compared to the systemic route, and the possibility for synergies with systemic agents. Historically, amphotericin B has been used both as the deoxycholate and in lipid forms, primarily as nebulized prophylaxis against pulmonary mould infection in haematological malignancies, and lung transplantation. Another setting is as therapy for allergic bronchopulmonary aspergillosis or Aspergillus tracheobronchitis. There have been case reports around the use of nebulized triazoles for the treatment of airway mycoses with varying success. More recently there has been a concerted effort to systematically develop specifically formulated nebulized antifungals with good airway distribution and retention. PC945 (Pulmocide Ltd.) is a novel triazole specifically designed to achieve high concentrations in the airway mucosa with limited systemic exposure. It has potent activity against Aspergillus and accumulates in the lung on repeat dosing. Interestingly, animal studies indicate improved efficacy when combined with systemic antifungals in murine pulmonary aspergillosis. It was well tolerated in healthy individuals and asthmatics. Initial case reports for nebulized PC945 as salvage therapy in refractory lung transplant Aspergillus tracheobronchitis showed complete response. Phase 2 study data in asthma and cystic fibrosis patients with pulmonary aspergillosis are currently being evaluated. Pulmazole (Pulmatrix Inc) is a new dry powder itraconazole formulation that was being evaluated in adult asthmatics with allergic bronchopulmonary aspergillosis in Phase 2 studies (NCT03960606). Pulmazole is engineered using proprietary technology that allows particles to be formulated as small, dense and dispersible particles for deep lung penetration. This allows for delivery as a dry powder by inhalation. There are also two formulations of voriconazole in development for inhalation, ZP-059 (Zambon Company S.P.A., Milano, Italy) and TFF-VORI (TFF Pharmaceuticals, Austin, TX), that have completed Phase 1 of development. Taken together, the likely availability of novel systemic antifungal drug classes as well as the option for inhalational antifungals has the potential to dramatically change the clinical landscape for therapeutic options for respiratory fungal diseases. A particularly exciting challenge will be to work out if combination therapies are superior and in particular whether the combination of systemic and inhaled antifungals is superior to conventional systemic therapies that are currently prevalent. This is particularly important in the context of airway mycoses where it seems likely that current systemic antifungal treatments are sub-optimal. Another unanswered question is around what utility adjunctive inhaled antifungals could have in the context of invasive pulmonary mycoses such as invasive aspergillosis. Finally, there are major unanswered questions around duration of therapy, with most trials of invasive aspergillosis using 6-12 weeks of therapy but very little data to guide where shorter courses may be appropriate. How Can We Improve the Diagnosis of Respiratory Fungal Infection?
Current diagnosis of respiratory fungal diseases revolves around three central pillars: the mycological evidence of infection, the clinical status of the host, and the evidence that there is an immune response to a fungus in the host. Detection of Fungal Infection Classically, mycological criteria have typically been based on fungal cultures from either the airway or tissue biopsy; however, these would not necessarily be diagnostic on their own (unless for an endemic mycosis) as the airway has always been considered non-sterile from a microbiological perspective and fungi are ubiquitous components of the aerial microbiota. Fungal polymerase chain reaction has been available for several decades, has long been established for the diagnosis of Pneumocystis pneumonia and has recently been approved for the diagnosis of invasive pulmonary aspergillosis. There has been limited work on the utility of fungal species multiplex PCRs, which would be highly attractive for airway samples across a range of settings such as haematological immunocompromise, lung transplantation and cystic fibrosis, where a specific, limited group of fungal pathogens account for the vast majority of infections. Further progress has been made with respect to more systematic use of both β-1,3-glucan and galactomannan as markers of respiratory fungal disease, where β-1,3-glucan is used in serum primarily as a screening assay and galactomannan has utility both in serum and airway samples for specific diagnosis of aspergillosis. There have been significant advances in the availability of lateral flow device assays for point-of-care testing across aspergillosis as well as endemic mycoses such as histoplasmosis. A further area of ongoing development is whether urinary antigens have utility for the diagnosis of respiratory mycoses. In general terms, it seems clear that the combination of two different assays such as PCR and antigen for identification of fungal disease leads to a much more robust diagnostic performance profile. The elephant in the room in terms of mycological diagnosis of respiratory mycoses is the airway mycobiome. There has been a significant body of work to describe this across a range of settings from the immunocompromised host, where it has been shown that the mycobiome or even the microbiome could predict pulmonary fungal disease. However, this work is still at a very early phase. Further detailed studies have been undertaken in chronic respiratory fungal diseases as well as in chronic respiratory disease more generally, to determine either the role that fungi play in the pathogenesis of diseases such as asthma, COPD and bronchiectasis, or to delineate the composition of the mycobiome and microbiome underlying conditions such as allergic bronchopulmonary aspergillosis or cystic fibrosis-related aspergillosis. Bigger multicentre studies using the latest metagenomic approaches are required, encompassing the interface between the virome, bacteriome and mycobiome. By extension, for fungal infections there is a critical question around the broader environment and in particular the aerial mycobiome as a driver for fungal infection. This has high relevance whether it be in the context of the neutropenic host with acute myeloid leukaemia, where acquisition of A.
A. fumigatus from the hospital environment has been clearly documented to cause infection, or in allergic fungal airway diseases, where further work is required to understand the relationship between population-level sensitization to allergenic fungi and exposure to these fungi in the environment. Groundbreaking metagenomic approaches have been developed to characterize the aerial mycobiome that hold great promise to allow better understanding, as well as prediction, of the environmental factors driving respiratory fungal disease.

Within the context of the immunocompromised host, a key issue is how to use diagnostics to guide treatment. In this regard, the primary question is whether universal prophylaxis (for instance, posaconazole in neutropenic acute myeloid leukaemia), pre-emptive mycological biomarker-driven therapy, or directed therapy for confirmed respiratory fungal disease is most appropriate. Whilst universal prophylaxis appears an expedient solution, the rising incidence of fungal resistance to antimicrobials argues against such an approach. In contrast, directed therapy, which is useful to minimize unnecessary antifungal usage, runs the risk of late diagnosis and consequently poorer outcomes. However, these approaches have rarely been systematically compared in randomized controlled trials.

The emergence of antifungal resistance as a major clinical issue is an area of great concern. This has been a huge problem for Candida spp., with replacement of C. albicans as the dominant pathogen by other species such as C. glabrata and C. krusei in high-risk azole-exposed populations, as well as the global emergence of C. auris as a high-transmission, multidrug-resistant human opportunistic pathogen. More recently, our understanding of emergent triazole resistance in A. fumigatus, as a consequence of both in-host adaptation to selective drug pressure and fungicide use in the environment, together with the observation of in-host evolution of fluconazole resistance in C. neoformans, means that we urgently need clinically robust molecular and phenotypic diagnostics to define the epidemiology and extent of resistance in clinical settings for all three major human fungal pathogens. Significant progress has been made with population-level next-generation sequencing; integration of these approaches into clinical laboratory workflows as soon as possible is now a major goal.

Immunodiagnosis of Fungal Inflammation

Immunodiagnosis is a cornerstone of the diagnosis of respiratory fungal diseases such as chronic pulmonary aspergillosis, allergic bronchopulmonary aspergillosis and severe asthma with fungal sensitization, as well as having utility for the identification of fungal drivers of hypersensitivity pneumonitis. The primary modality is the detection of antibody responses to immunodominant fungal antigens and allergens. However, the field is vastly complex, because fungi have one of the highest numbers of antigenic molecules when compared with other allergen sources. In the context of aspergillosis this is of particular interest because, whilst some antigens are immunodominant and therefore have utility as components of sensitive screening assays, other antigens may have higher predictive value for disease severity and progression. For instance, there are currently 23 WHO-defined fungal allergens for A. fumigatus (http://allergen.org). There has been some recent progress in the development of multiplex assays that can identify broad antigen repertoires for these diseases.
Such an approach is likely to have utility for precision diagnostic medicine across chronic respiratory fungal infection, allergic fungal airway disease and hypersensitivity pneumonitis. Further progress has been made through the development of cellular response assays, with a particular area of focus being fungal-reactive T cells. These have been shown to have utility as assays to identify active infection both in the context of invasive aspergillosis and in cystic fibrosis-related aspergillosis. Large multicentre studies are required to further validate their utility for the early and accurate identification of individuals with respiratory fungal diseases. Such T cell response assays ought to be applicable to a wider range of respiratory fungal pathogens; utility has already been shown for Aspergillus spp. as well as Mucor spp. Basophil activation assays, which are used to confirm the functional ability of allergens to induce effector cell degranulation, have also been shown to be useful for assessing the response of patients with allergic fungal airway disease to immunotherapies such as the IgE-depleting monoclonal antibody omalizumab.

Identification of Fungal Immunogenetic Risk

Immunogenetic risk prediction for respiratory fungal disease has great promise as a key diagnostic tool in the at-risk host. Important work in this area has been undertaken within the context of primary immunodeficiencies, for instance Dectin-1/CARD9/JAK-STAT mutations and the risk of chronic mucocutaneous candidiasis. In addition, there has been tremendous progress in identifying immunogenetic alleles that confer risk for invasive fungal disease in the context of transplantation, where it has been shown that either donor or recipient alleles may be implicated. In the context of haematological stem cell transplant, current data suggest that where the recipient immunogenotype is predictive of risk, this is due to deficiencies in the recipient respiratory epithelium, whereas where the donor immunogenotype is predictive, this maps to the donor stem cell myeloid compartment. Extension of such approaches to other at-risk groups for respiratory fungal diseases would be of great interest, with some studies already undertaken, for instance, in allergic bronchopulmonary aspergillosis. Furthermore, most immunogenetic studies thus far have focused on selected groups of alleles. Large-scale, multinational genome-wide association studies would give greater resolution as to which alleles are dominant and whether they are penetrant in all populations.

Personalized Medicine to Identify Clinically Relevant Endotypes

The identification of specific clinically relevant disease endotypes has been greatly accelerated in respiratory medicine through the advent of systematic immunophenotyping studies [95]. Given the current revolution in monoclonal antibody therapies in clinical medicine, with many agents now licensed for the therapy of asthma and allergic rhinitis, repurposing of these monoclonals for allergic fungal airway diseases in particular, and better understanding of which may be more efficacious in this setting, is a current high priority. We are already using omalizumab routinely for IgE depletion in allergic bronchopulmonary aspergillosis and severe asthma with fungal sensitization, with accumulating but mainly case series-based clinical data to support this approach.
Mepolizumab-based targeting of eosinophilic responses is also now commonplace, although there are some safety concerns more generally, as eosinophils may have a protective role against invasive aspergillosis. Better understanding is therefore required of which immune pathways contribute most to progressive decline in allergic fungal airway diseases, in order to enable the precise targeting of monoclonal therapeutics. To do this, large multicentre prospective immunoprofiling studies are required, adopting a systematic approach to identify those targetable immune pathways that are most predictive of severe endotypes of allergic fungal airway disease. Optimal resolution of endotypes is achieved when systemic immune signatures (e.g. peripheral blood multiparameter flow sorting), local airway signatures (e.g. sputum transcriptome, sputum mycobiome) and clinical data (e.g. radiological scoring, respiratory physiology and clinical questionnaires) are integrated to provide multidimensional data linked to longitudinal clinical outcomes. Such an approach has been used to identify new asthma endotypes and would be of great utility for chronic respiratory fungal diseases, where we still do not fully understand, for instance, why only some patients with allergic bronchopulmonary aspergillosis respond to steroids, whereas others may show a response to antifungals. The emergence of artificial intelligence, or at least machine learning, has opened the door to novel approaches for the identification of respiratory disease radiological endotypes, where a range of fungal-specific, or at least predictive, features such as nodules or cavitation exist but could be further refined and automatically identified using unbiased image analysis approaches.

Conclusions

Over the last decade there has been greater awareness of the prevalence of respiratory fungal diseases globally. The advent of the 4th industrial revolution leaves us poised to exploit molecular engineering and post-genomic technologies, in combination with big data science and artificial intelligence, to revolutionize our understanding of the pathogenesis of these complex infections. Substantial progress in drug discovery and development, as well as the rational design of novel immunotherapeutics, has greatly broadened the therapeutic armamentarium with which to combat respiratory fungal disease. The major challenge we face in translating these advances for the benefit of patients will be to ensure that improvements in the understanding of disease pathogenesis and novel therapeutic options are integrated within a personalized medicine framework, so that the right patient gets the right treatment at the right time.
Glucose uptake and pulsatile insulin infusion: euglycaemic clamp and glucose studies in healthy subjects. To test the hypothesis that insulin has a greater effect on glucose metabolism when given as a pulsatile rather than a continuous infusion, a 354-min euglycaemic clamp study was carried out in 8 healthy subjects. In random order, soluble insulin was given intravenously either at a constant rate of 0.45 mU/kg × min or in identical amounts in pulses of 1 1/2 to 2 1/4 min followed by intervals of 10 1/2 to 9 3/4 min. Average serum insulin levels were similar during the two infusion protocols, but pulsatile administration induced oscillations ranging between 15 and 62 microU/ml. Glucose uptake, expressed as the metabolic clearance rate (MCR) for glucose, was significantly increased during pulsatile insulin delivery compared with continuous administration (270-294 min: 8.7 +/- 0.7 vs 6.8 +/- 0.9 ml/kg × min, P < 0.01, and 330-354 min: 8.9 +/- 0.5 vs 7.4 +/- 0.9 ml/kg × min, P < 0.05). The superior efficacy of pulsatile insulin delivery on glucose uptake was not consistently found until after 210 min of insulin administration. In both infusion protocols, endogenous glucose production, as estimated by the glucose infusion technique, was suppressed to insignificant values. Finally, the effect of insulin on endogenous insulin secretion and lipolysis, as assessed by changes in serum C-peptide and serum FFA, was uninfluenced by the infusion mode. In conclusion, insulin infusion resulting in physiological serum insulin levels enhances glucose uptake in peripheral tissues in healthy subjects to a higher degree when given in a pulsed pattern mimicking that of the normal endocrine pancreas than when given as a continuous infusion.
//==============================================================================
// Project : Grape
// Module : Graphics
// File : SoRenderArea.cpp
//
// Note : This file is a derivative work of 'SoViewer' library available at
// http://code.google.com/p/openinventorviewer/.
//
// Copyright (c) 2012, <NAME>
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright
// notice, this list of conditions and the following disclaimer in the
// documentation and/or other materials provided with the distribution.
// * Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from
// this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR ITS CONTRIBUTORS
// BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
// THE POSSIBILITY OF SUCH DAMAGE.
//==============================================================================
#include "SoRenderArea.h"
#include <Inventor/SoDB.h>
#include <Inventor/nodekits/SoNodeKit.h>
#include <Inventor/SoInteraction.h>
#include <Inventor/elements/SoGLCacheContextElement.h>
namespace grape
{
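// Implements the run-time type information for the custom SoWheelEvent
// (declared in the header), so it can be dispatched through the event
// manager in the same way as the built-in Inventor events.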
SO_EVENT_SOURCE(SoWheelEvent);
//==============================================================================
void SoRenderArea::init()
//==============================================================================
{
SoDB::init();
SoNodeKit::init();
SoInteraction::init();
}
//------------------------------------------------------------------------------
SoRenderArea::SoRenderArea()
//------------------------------------------------------------------------------
{
SoRenderArea::init();
_pWheelEvent = new SoWheelEvent;
_pKeyboardEvent = new SoKeyboardEvent;
_pMouseButtonEvent = new SoMouseButtonEvent;
_pLocation2Event = new SoLocation2Event;
_pEventManager = new SoEventManager;
_pRenderManager = new SoRenderManager;
_pRenderManager->getGLRenderAction()->setCacheContext( SoGLCacheContextElement::getUniqueCacheContext() );
_pRenderManager->setBackgroundColor(SbColor4f(0.0f, 0.0f, 0.0f, 1.0f));
_pRenderManager->setRenderCallback(renderCallback, this); // turns on auto-update if callback is valid
}
//------------------------------------------------------------------------------
SoRenderArea::~SoRenderArea()
//------------------------------------------------------------------------------
{
delete _pRenderManager;
delete _pEventManager;
delete _pLocation2Event;
delete _pMouseButtonEvent;
delete _pKeyboardEvent;
delete _pWheelEvent;
}
//------------------------------------------------------------------------------
void SoRenderArea::setSceneGraph(SoNode *pScene)
//------------------------------------------------------------------------------
{
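// Detach the render manager while the scene graph is swapped on both the
// render and event managers, then re-attach and request a redraw of the
// new scene.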
_pRenderManager->deactivate();
_pRenderManager->setSceneGraph( pScene );
_pEventManager->setSceneGraph( pScene );
_pRenderManager->activate();
_pRenderManager->scheduleRedraw();
}
//------------------------------------------------------------------------------
void SoRenderArea::setViewportRegion(const SbViewportRegion& region)
//------------------------------------------------------------------------------
{
getGLRenderAction()->setViewportRegion(region);
_pEventManager->setViewportRegion( region );
}
//------------------------------------------------------------------------------
void SoRenderArea::setAutoRedraw(SbBool enable)
//------------------------------------------------------------------------------
{
if( enable )
{
_pRenderManager->setRenderCallback(renderCallback, this);
}
else
{
_pRenderManager->setRenderCallback(NULL, NULL);
}
}
//------------------------------------------------------------------------------
void SoRenderArea::soKeyPressEvent( SoKeyboardEvent *pEvent)
//------------------------------------------------------------------------------
{
_pEventManager->processEvent( pEvent );
}
//------------------------------------------------------------------------------
void SoRenderArea::soKeyReleaseEvent( SoKeyboardEvent *pEvent)
//------------------------------------------------------------------------------
{
_pEventManager->processEvent( pEvent );
}
//------------------------------------------------------------------------------
void SoRenderArea::soMouseMoveEvent( SoLocation2Event *pEvent )
//------------------------------------------------------------------------------
{
_pEventManager->processEvent( pEvent );
}
//------------------------------------------------------------------------------
void SoRenderArea::soMousePressEvent( SoMouseButtonEvent *pEvent )
//------------------------------------------------------------------------------
{
_pEventManager->processEvent( pEvent );
}
//------------------------------------------------------------------------------
void SoRenderArea::soMouseReleaseEvent( SoMouseButtonEvent *pEvent )
//------------------------------------------------------------------------------
{
_pEventManager->processEvent( pEvent );
}
//------------------------------------------------------------------------------
void SoRenderArea::soWheelEvent( SoWheelEvent *pEvent )
//------------------------------------------------------------------------------
{
_pEventManager->processEvent( pEvent );
}
//------------------------------------------------------------------------------
void SoRenderArea::soResizeEvent(int width, int height)
//------------------------------------------------------------------------------
{
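// Propagate the new window size and viewport to both the render manager and
// the event manager so that rendering and event-to-scene mapping stay in
// sync, then request a redraw.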
SbVec2s size((short)width, (short) height);
SbViewportRegion region(size);
_pRenderManager->setWindowSize(size);
_pRenderManager->setSize(size);
_pEventManager->setSize(size);
_pRenderManager->setViewportRegion(region);
_pEventManager->setViewportRegion(region);
_pRenderManager->scheduleRedraw();
}
//------------------------------------------------------------------------------
void SoRenderArea::soPaintEvent()
//------------------------------------------------------------------------------
{
_pRenderManager->render(/*clearWindow*/ 1, /*clearZBuffer*/ 1);
}
//------------------------------------------------------------------------------
void SoRenderArea::renderCallback(void *pUserData, SoRenderManager*)
//------------------------------------------------------------------------------
{
((SoRenderArea*)pUserData)->soRenderCallback();
}
} // grape
|
Three-Phase Induction Motor Short-Circuit Fault Diagnosis using MCSA and NSC. In this paper, a comparative study has been performed between two methods used to detect short circuits in the stator winding. The first approach is based on motor current signature analysis (MCSA), while the second is based on analysis of the negative sequence component (NSC). The study shows the advantages and disadvantages of each method, as well as the change in each method's fault indicator under different operating loads and in the presence of a short circuit in the stator winding. Experimental results show that MCSA is more effective at detecting short circuits in the stator winding.
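As a rough illustration of the two indicators compared above, the sketch below estimates the negative sequence component via the Fortescue transform and a current spectrum for MCSA. It is a minimal sketch only, assuming three sampled phase currents and a 50 Hz supply; the signal parameters and the mild asymmetry are invented for illustration and are not taken from the experiments described here.

import numpy as np

def negative_sequence_component(Ia, Ib, Ic):
    # NSC of the fundamental phasors via the Fortescue (symmetrical components) transform.
    a = np.exp(2j * np.pi / 3)
    return (Ia + a ** 2 * Ib + a * Ic) / 3.0

def current_spectrum(i, fs):
    # One-sided amplitude spectrum of a single phase current for MCSA
    # (amplitudes are approximate; no window-gain correction is applied).
    n = len(i)
    spectrum = np.abs(np.fft.rfft(i * np.hanning(n))) * 2.0 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, spectrum

# Synthetic, slightly unbalanced 50 Hz currents (illustrative values only).
fs = 5000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
ia = 10.0 * np.cos(2 * np.pi * 50 * t)
ib = 9.5 * np.cos(2 * np.pi * 50 * t - 2 * np.pi / 3)  # mild stator asymmetry
ic = 10.0 * np.cos(2 * np.pi * 50 * t + 2 * np.pi / 3)

def fundamental_phasor(i):
    # DFT-style estimate of the 50 Hz phasor over an integer number of periods.
    return 2.0 / len(i) * np.sum(i * np.exp(-2j * np.pi * 50 * t))

I2 = negative_sequence_component(
    fundamental_phasor(ia), fundamental_phasor(ib), fundamental_phasor(ic)
)
freqs, spec = current_spectrum(ia, fs)
print(abs(I2))                                   # NSC magnitude grows with stator asymmetry
print(spec[np.argmin(np.abs(freqs - 50.0))])     # fundamental amplitude from the MCSA spectrum

In practice the MCSA indicator is read from fault-related components of this spectrum around the supply frequency, whose exact locations depend on the machine parameters and the operating slip, while the NSC magnitude is the scalar indicator tracked in the second method.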
import os
import unittest
from contextlib import redirect_stdout
from datetime import datetime
from io import StringIO
from unittest.mock import patch
from pepys_import.resolvers.command_line_resolver import CommandLineResolver
from pepys_import.core.store.data_store import DataStore
DIR_PATH = os.path.dirname(__file__)
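# These tests exercise CommandLineResolver against an in-memory SQLite DataStore.
# User interaction is simulated by patching create_menu (and, where needed, prompt)
# and feeding canned menu selections and typed answers through mock side_effect lists.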
class PrivacyTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_add_new_privacy(self, menu_prompt):
"""Test whether a new Privacy entity created or not
after searched and not founded in the Privacy Table."""
# Select "Search an existing classification"->Search "PRIVACY-TEST"->
# Select "Yes"
menu_prompt.side_effect = ["1", "PRIVACY-TEST", "1"]
with self.store.session_scope():
privacy = self.resolver.resolve_privacy(self.store, self.change_id)
self.assertEqual(privacy.name, "PRIVACY-TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_select_existing_privacy(self, menu_prompt):
"""Test whether an existing Privacy entity searched and returned or not"""
# Select "Search an existing classification"->Search "PRIVACY-TEST"
menu_prompt.side_effect = ["1", "PRIVACY-TEST"]
with self.store.session_scope():
self.store.add_to_privacies("PRIVACY-TEST", self.change_id)
privacy = self.resolver.resolve_privacy(self.store, self.change_id)
self.assertEqual(privacy.name, "PRIVACY-TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_fuzzy_search_select_existing_privacy_without_search(
self, resolver_prompt, menu_prompt
):
"""Test whether a new Privacy entity created or not"""
# Select "Add a new classification"->Type "PRIVACY-TEST"
menu_prompt.side_effect = ["2"]
resolver_prompt.side_effect = ["PRIVACY-TEST"]
with self.store.session_scope():
self.store.add_to_privacies("PRIVACY-TEST", self.change_id)
privacy = self.resolver.resolve_privacy(self.store, self.change_id)
self.assertEqual(privacy.name, "PRIVACY-TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_recursive_privacy(self, menu_prompt):
"""Test whether recursive call works for privacy"""
# Select "Search an existing classification"->Search "PRIVACY-TEST"->Select "No"
# ->Search "PRIVACY-1"
menu_prompt.side_effect = ["1", "PRIVACY-TEST", "2", "PRIVACY-1"]
with self.store.session_scope():
self.store.add_to_privacies("PRIVACY-1", self.change_id)
self.store.add_to_privacies("PRIVACY-2", self.change_id)
privacy = self.resolver.resolve_privacy(self.store, self.change_id)
self.assertEqual(privacy.name, "PRIVACY-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_privacy(self, menu_prompt):
"""Test whether "." returns to the resolver privacy"""
# Search "TEST"->Select "."->Select "."
menu_prompt.side_effect = ["TEST", ".", "."]
with self.store.session_scope():
privacy = self.resolver.fuzzy_search_privacy(self.store, self.change_id)
self.assertIsNone(privacy)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_resolver_privacy(self, menu_prompt):
"""Test whether "." cancels the resolve privacy and returns None"""
menu_prompt.side_effect = ["."]
with self.store.session_scope():
# Select "."
privacy = self.resolver.resolve_privacy(self.store, self.change_id)
self.assertIsNone(privacy)
class NationalityTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_add_new_nationality(self, menu_prompt):
"""Test whether a new nationality is added or not"""
# Type "TEST"->Select "Yes"
menu_prompt.side_effect = ["TEST", "1"]
with self.store.session_scope():
nationality = self.resolver.fuzzy_search_nationality(
self.store, "PLATFORM-1", self.change_id
)
self.assertEqual(nationality.name, "TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_nationality_recursive(self, menu_prompt):
"""Test whether recursive call works for Nationality"""
# Type "TEST"->Select "No, I'd like to select a nationality"->Type "UK"
menu_prompt.side_effect = ["TEST", "2", "UK"]
with self.store.session_scope():
self.store.add_to_nationalities("UK", self.change_id)
self.store.add_to_nationalities("USA", self.change_id)
nationality = self.resolver.fuzzy_search_nationality(
self.store, "PLATFORM-1", self.change_id
)
self.assertEqual(nationality.name, "UK")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_nationality(self, menu_prompt):
"""Test whether "." returns to the resolve nationality """
menu_prompt.side_effect = [".", ".", "TEST", ".", "."]
with self.store.session_scope():
temp_output = StringIO()
# Select "."->Select "."
with redirect_stdout(temp_output):
nationality = self.resolver.fuzzy_search_nationality(
self.store, "PLATFORM-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(nationality)
# Search "TEST"->Select "."->Select "."
with redirect_stdout(temp_output):
nationality = self.resolver.fuzzy_search_nationality(
self.store, "PLATFORM-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(nationality)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_resolve_nationality(self, menu_prompt):
"""Test whether "." cancels the resolve nationality and returns None"""
menu_prompt.side_effect = ["."]
with self.store.session_scope():
# Select "."
nationality = self.resolver.resolve_nationality(
self.store, "", self.change_id
)
self.assertIsNone(nationality)
class PlatformTypeTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_add_new_platform_type(self, menu_prompt):
"""Test whether a new Platform Type is added or not"""
# Type "TEST"->Select "Yes"
menu_prompt.side_effect = ["TEST", "1"]
with self.store.session_scope():
platform_type = self.resolver.fuzzy_search_platform_type(
self.store, "PLATFORM-1", self.change_id
)
self.assertEqual(platform_type.name, "TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_platform_type_recursive(self, menu_prompt):
"""Test whether recursive call works for Platform Type"""
# Type "TEST"->Select "No, I'd like to select a platform type"->Type "PLATFORM-TYPE-1"
menu_prompt.side_effect = ["TEST", "2", "PLATFORM-TYPE-1"]
with self.store.session_scope():
self.store.add_to_platform_types("PLATFORM-TYPE-1", self.change_id)
self.store.add_to_platform_types("PLATFORM-TYPE-2", self.change_id)
platform_type = self.resolver.fuzzy_search_platform_type(
self.store, "PLATFORM-1", self.change_id
)
self.assertEqual(platform_type.name, "PLATFORM-TYPE-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_platform_type(self, menu_prompt):
"""Test whether "." returns to the resolve platform type"""
menu_prompt.side_effect = [".", ".", "TEST", ".", "."]
with self.store.session_scope():
temp_output = StringIO()
# Select "."->Select "."
with redirect_stdout(temp_output):
platform_type = self.resolver.fuzzy_search_platform_type(
self.store, "PLATFORM-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(platform_type)
# Search "TEST"->Select "."->Select "."
with redirect_stdout(temp_output):
platform_type = self.resolver.fuzzy_search_platform_type(
self.store, "PLATFORM-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(platform_type)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_resolve_platform_type(self, menu_prompt):
"""Test whether "." cancels the resolve platform type and returns None"""
menu_prompt.side_effect = ["."]
with self.store.session_scope():
platform_type = self.resolver.resolve_platform_type(
self.store, "PLATFORM-1", self.change_id
)
self.assertIsNone(platform_type)
class DatafileTypeTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_add_new_datafile_type(self, menu_prompt):
"""Test whether a new Datafile Type is added or not"""
# Type "TEST"->Select "Yes"
menu_prompt.side_effect = ["TEST", "1"]
with self.store.session_scope():
datafile_type = self.resolver.fuzzy_search_datafile_type(
self.store, "DATAFILE-1", self.change_id
)
self.assertEqual(datafile_type.name, "TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_datafile_type_recursive(self, menu_prompt):
"""Test whether recursive call works for Datafile Type"""
# Type "TEST"->Select "No, I'd like to select a datafile type"->Type "DATAFILE-TYPE-1"
menu_prompt.side_effect = ["TEST", "2", "DATAFILE-TYPE-1"]
with self.store.session_scope():
self.store.add_to_datafile_types("DATAFILE-TYPE-1", self.change_id)
self.store.add_to_datafile_types("DATAFILE-TYPE-2", self.change_id)
datafile_type = self.resolver.fuzzy_search_datafile_type(
self.store, "DATAFILE-1", self.change_id
)
self.assertEqual(datafile_type.name, "DATAFILE-TYPE-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_datafile_type(self, menu_prompt):
"""Test whether "." returns to the resolve datafile type"""
menu_prompt.side_effect = [".", ".", "TEST", ".", "."]
temp_output = StringIO()
with self.store.session_scope():
# Select "."->Select "."
with redirect_stdout(temp_output):
datafile_type = self.resolver.fuzzy_search_datafile_type(
self.store, "DATAFILE-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(datafile_type)
# Type "TEST"->Select "."->Select "."
with redirect_stdout(temp_output):
datafile_type = self.resolver.fuzzy_search_datafile_type(
self.store, "DATAFILE-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(datafile_type)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolve_datafile_type_add_new_datafile_type(
self, resolver_prompt, menu_prompt
):
"""Test whether a new Datafile Type is added or not"""
# Select "Add a new datafile type" -> Type "TEST"
menu_prompt.side_effect = ["2"]
resolver_prompt.side_effect = ["TEST"]
with self.store.session_scope():
datafile_type = self.resolver.resolve_datafile_type(
self.store, "DATAFILE-1", self.change_id
)
self.assertEqual(datafile_type.name, "TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_resolve_datafile_type(self, menu_prompt):
"""Test whether "." cancels the resolve datafile type and returns None"""
menu_prompt.side_effect = ["."]
with self.store.session_scope():
datafile_type = self.resolver.resolve_datafile_type(
self.store, "DATAFILE-1", self.change_id
)
self.assertIsNone(datafile_type)
class SensorTypeTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_add_new_sensor_type(self, menu_prompt):
"""Test whether a new Sensor Type is added or not"""
# Type "TEST"->Select "Yes"
menu_prompt.side_effect = ["TEST", "1"]
with self.store.session_scope():
sensor_type = self.resolver.fuzzy_search_sensor_type(
self.store, "SENSOR-1", self.change_id
)
self.assertEqual(sensor_type.name, "TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_sensor_type_recursive(self, menu_prompt):
"""Test whether recursive call works for Sensor Type"""
# Type "TEST"->Select "No, I'd like to select a sensor type"->Type "SENSOR-TYPE-1"
menu_prompt.side_effect = ["TEST", "2", "SENSOR-TYPE-1"]
with self.store.session_scope():
self.store.add_to_sensor_types("SENSOR-TYPE-1", self.change_id)
self.store.add_to_sensor_types("SENSOR-TYPE-2", self.change_id)
sensor_type = self.resolver.fuzzy_search_sensor_type(
self.store, "SENSOR-1", self.change_id
)
self.assertEqual(sensor_type.name, "SENSOR-TYPE-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_sensor_type(self, menu_prompt):
"""Test whether "." returns to the resolver sensor type"""
menu_prompt.side_effect = [".", ".", "TEST", ".", "."]
with self.store.session_scope():
temp_output = StringIO()
# Select "."->Select "."
with redirect_stdout(temp_output):
sensor_type = self.resolver.fuzzy_search_sensor_type(
self.store, "SENSOR-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(sensor_type)
# Type "TEST"->Select "."->Select "."
with redirect_stdout(temp_output):
sensor_type = self.resolver.fuzzy_search_sensor_type(
self.store, "SENSOR-1", self.change_id
)
output = temp_output.getvalue()
self.assertIn("Returning to the previous menu", output)
self.assertIsNone(sensor_type)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_resolve_sensor_type(self, menu_prompt):
"""Test whether "." cancels the resolve sensor type and returns None"""
menu_prompt.side_effect = ["."]
with self.store.session_scope():
sensor_type = self.resolver.resolve_sensor_type(
self.store, "SENSOR-1", self.change_id
)
self.assertIsNone(sensor_type)
class PlatformTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_fuzzy_search_add_platform_to_synonym(self, menu_prompt):
"""Test whether entered platform name is added as a synonym or not"""
# Search "PLATFORM-1"->Select "Yes"
menu_prompt.side_effect = ["PLATFORM-1", "1"]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id)
platform_type = self.store.add_to_platform_types("Warship", self.change_id)
nationality = self.store.add_to_nationalities("UK", self.change_id)
platform = self.store.get_platform(
"PLATFORM-1",
nationality=nationality.name,
platform_type=platform_type.name,
privacy=privacy.name,
change_id=self.change_id,
)
synonym_platform = self.resolver.fuzzy_search_platform(
self.store,
"TEST",
nationality=nationality.name,
platform_type=platform_type.name,
privacy=privacy.name,
change_id=self.change_id,
)
self.assertEqual(platform.platform_id, synonym_platform.platform_id)
@patch("pepys_import.resolvers.command_line_input.prompt")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_fuzzy_search_add_new_platform(self, resolver_prompt, menu_prompt):
"""Test whether a new platform entity is created or not"""
# Search "PLATFORM-1"->Select "No"->Type name/trigraph/quadgraph/pennant number->Select "Yes"
menu_prompt.side_effect = ["PLATFORM-1", "2", "1"]
resolver_prompt.side_effect = ["TEST", "TST", "TEST", "123"]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id)
platform_type = self.store.add_to_platform_types("Warship", self.change_id)
nationality = self.store.add_to_nationalities("UK", self.change_id)
self.store.get_platform(
"PLATFORM-1",
trigraph="PL1",
quadgraph="PLT1",
pennant_number="123",
nationality=nationality.name,
platform_type=platform_type.name,
privacy=privacy.name,
change_id=self.change_id,
)
(
platform_name,
trigraph,
quadgraph,
pennant_number,
platform_type,
nationality,
privacy,
) = self.resolver.fuzzy_search_platform(
self.store,
"TEST",
nationality=nationality.name,
platform_type=platform_type.name,
privacy=privacy.name,
change_id=self.change_id,
)
self.assertEqual(platform_name, "TEST")
self.assertEqual(trigraph, "TST")
self.assertEqual(quadgraph, "TEST")
self.assertEqual(pennant_number, "123")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolver_platform_with_fuzzy_searches(
self, resolver_platform, menu_prompt
):
"""Test whether correct entities return when fuzzy search for platform type, nationality and privacy are
called"""
# Select "Search for existing platform"->Type "TEST"->Type name/trigraph/quadgraph/pennant number->Select
# "Search for an existing nationality"->Select "UK"->Select "Search for an existing platform type"->Select
# "Warship"->Select "Search for an existing classification"->Select "PRIVACY-1"->Select "Yes"
menu_prompt.side_effect = [
"1",
"TEST",
"1",
"UK",
"1",
"Warship",
"1",
"PRIVACY-1",
"1",
]
resolver_platform.side_effect = ["TEST", "TST", "TEST", "123"]
with self.store.session_scope():
self.store.add_to_privacies("PRIVACY-1", self.change_id)
self.store.add_to_platform_types("Warship", self.change_id)
self.store.add_to_nationalities("UK", self.change_id)
(
platform_name,
trigraph,
quadgraph,
pennant_number,
platform_type,
nationality,
privacy,
) = self.resolver.resolve_platform(
data_store=self.store,
platform_name="TEST",
platform_type=None,
nationality=None,
privacy=None,
change_id=self.change_id,
)
self.assertEqual(platform_name, "TEST")
self.assertEqual(trigraph, "TST")
self.assertEqual(quadgraph, "TEST")
self.assertEqual(pennant_number, "123")
self.assertEqual(platform_type.name, "Warship")
self.assertEqual(nationality.name, "UK")
self.assertEqual(privacy.name, "PRIVACY-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolver_platform_with_new_values(self, resolver_prompt, menu_prompt):
"""Test whether new platform type, nationality and privacy entities are created for Platform or not"""
# Select "Add a new platform"->Type name/trigraph/quadgraph/pennant number->Select "Add a new nationality"->
# Select "UK"->Select "Add a new platform type"->Select "Warship"->Select "Add a new classification"->Select
# "PRIVACY-1"->Select "Yes"
menu_prompt.side_effect = ["2", "2", "2", "2", "1"]
resolver_prompt.side_effect = [
"TEST",
"TST",
"TEST",
"123",
"UK",
"Warship",
"PRIVACY-1",
]
with self.store.session_scope():
(
platform_name,
trigraph,
quadgraph,
pennant_number,
platform_type,
nationality,
privacy,
) = self.resolver.resolve_platform(
data_store=self.store,
platform_name="TEST",
platform_type=None,
nationality=None,
privacy=None,
change_id=self.change_id,
)
self.assertEqual(platform_name, "TEST")
self.assertEqual(trigraph, "TST")
self.assertEqual(quadgraph, "TEST")
self.assertEqual(pennant_number, "123")
self.assertEqual(platform_type.name, "Warship")
self.assertEqual(nationality.name, "UK")
self.assertEqual(privacy.name, "PRIVACY-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolver_platform_edit_given_values(self, resolver_prompt, menu_prompt):
"""Test a new platform is created after make further edits option is selected"""
# Select "Add a new platform"->Type name/trigraph/quadgraph/pennant number->Select "No"->
# Type name/trigraph/quadgraph/pennant number->Select "Search for an existing nationality"->Select
# "UK"->Select "Search for an existing platform type"->Select "Warship"->Select "Search for an existing
# classification"->Select "PRIVACY-1"->Select "Yes"
menu_prompt.side_effect = [
"2",
"2",
"1",
"UK",
"1",
"Warship",
"1",
"PRIVACY-1",
"1",
]
resolver_prompt.side_effect = [
"TEST",
"TST",
"TEST",
"123",
"TEST",
"TST",
"TEST",
"123",
]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
platform_type = self.store.add_to_platform_types(
"Warship", self.change_id
).name
nationality = self.store.add_to_nationalities("UK", self.change_id).name
(
platform_name,
trigraph,
quadgraph,
pennant_number,
platform_type,
nationality,
privacy,
) = self.resolver.resolve_platform(
data_store=self.store,
platform_name="TEST",
platform_type=platform_type,
nationality=nationality,
privacy=privacy,
change_id=self.change_id,
)
self.assertEqual(platform_name, "TEST")
self.assertEqual(trigraph, "TST")
self.assertEqual(quadgraph, "TEST")
self.assertEqual(pennant_number, "123")
self.assertEqual(platform_type.name, "Warship")
self.assertEqual(nationality.name, "UK")
self.assertEqual(privacy.name, "PRIVACY-1")
class DatafileTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolver_datafile_edit_given_values(self, resolver_prompt, menu_prompt):
"""Test whether correct datafile type and privacy returns after resolver is further edited"""
# Select "Add a new datafile"->Type "TEST"->Select "No"->Type "TEST"->
# Select "Search for an existing datafile-type"->Search "DATAFILE-TYPE-2"->
# Select "Search for an existing classification"->Search "PRIVACY-2"->Select "Yes"
menu_prompt.side_effect = [
"2",
"2",
"1",
"DATAFILE-TYPE-2",
"1",
"PRIVACY-2",
"1",
]
resolver_prompt.side_effect = ["TEST", "TEST"]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
privacy_2 = self.store.add_to_privacies("PRIVACY-2", self.change_id).name
datafile_type = self.store.add_to_datafile_types(
"DATAFILE-TYPE-1", self.change_id
).name
datafile_type_2 = self.store.add_to_datafile_types(
"DATAFILE-TYPE-2", self.change_id
).name
(datafile_name, datafile_type, privacy,) = self.resolver.resolve_datafile(
data_store=self.store,
datafile_name="TEST",
datafile_type=datafile_type,
privacy=privacy,
change_id=self.change_id,
)
self.assertEqual(datafile_name, "TEST")
self.assertEqual(datafile_type.name, "DATAFILE-TYPE-2")
self.assertEqual(privacy.name, "PRIVACY-2")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolver_datafile_add_new_datafile(self, resolver_prompt, menu_prompt):
"""Test whether the correct datafile type and privacy entities are returned after searched and not found in
Datafile Table."""
# Select "Search for an existing Datafile"->Search "DATAFILE-1"->Type "TEST"->
# Select "Search for an existing datafile type"->Search "DATAFILE-TYPE-1"->
# Select "Search for an existing classification"->Search "PRIVACY-1->Select "Yes"
menu_prompt.side_effect = [
"1",
"DATAFILE-1",
"1",
"DATAFILE-TYPE-1",
"1",
"PRIVACY-1",
"1",
]
resolver_prompt.side_effect = ["TEST"]
with self.store.session_scope():
self.store.add_to_privacies("PRIVACY-1", self.change_id)
self.store.add_to_datafile_types("DATAFILE-TYPE-1", self.change_id)
datafile_name, datafile_type, privacy = self.resolver.resolve_datafile(
data_store=self.store,
datafile_name="TEST",
datafile_type=None,
privacy=None,
change_id=self.change_id,
)
self.assertEqual(datafile_name, "TEST")
self.assertEqual(datafile_type.name, "DATAFILE-TYPE-1")
self.assertEqual(privacy.name, "PRIVACY-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_resolve_datafile_add_to_synonyms(self, menu_prompt):
"""Test whether the given datafile name is correctly added to Synonyms table or not"""
# Select "Search an existing datafile"->Search "DATAFILE-1"->Select "Yes"
menu_prompt.side_effect = ["1", "DATAFILE-1", "1"]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id)
datafile_type = self.store.add_to_datafile_types(
"DATAFILE-TYPE-1", self.change_id
)
datafile = self.store.add_to_datafiles(
file_type=datafile_type.name,
privacy=privacy.name,
reference="DATAFILE-1",
change_id=self.change_id,
)
synonym_datafile = self.resolver.resolve_datafile(
data_store=self.store,
datafile_name="TEST",
datafile_type=datafile_type.name,
privacy=privacy.name,
change_id=self.change_id,
)
self.assertEqual(synonym_datafile.datafile_id, datafile.datafile_id)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_fuzzy_search_datafile_add_new_datafile(self, resolver_prompt, menu_prompt):
"""Test whether a new datafile is created or not after searched and not founded in Datafile Table."""
# Search "DATAFILE-1"->Type "TEST"->Select "No"->Select "Yes"
menu_prompt.side_effect = [
"DATAFILE-1",
"2",
"1",
]
resolver_prompt.side_effect = ["TEST"]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
datafile_type = self.store.add_to_datafile_types(
"DATAFILE-TYPE-1", self.change_id
).name
self.store.add_to_datafiles(
file_type=datafile_type,
privacy=privacy,
reference="DATAFILE-1",
change_id=self.change_id,
)
datafile_name, datafile_type, privacy = self.resolver.fuzzy_search_datafile(
data_store=self.store,
datafile_name="TEST",
datafile_type=datafile_type,
privacy=privacy,
change_id=self.change_id,
)
self.assertEqual(datafile_name, "TEST")
self.assertEqual(datafile_type.name, "DATAFILE-TYPE-1")
self.assertEqual(privacy.name, "PRIVACY-1")
class SensorTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolve_sensor(self, resolver_prompt, menu_prompt):
"""Test whether correct sensor type and privacy entities are resolved or not"""
# Select "Add a new sensor"->Type "TEST"->Select "Add a new sensor-type"->
# Type "SENSOR-TYPE-1"->Select "Add a new classification"->Type "PRIVACY-1"->Select "Yes"
menu_prompt.side_effect = ["2", "2", "2", "1"]
resolver_prompt.side_effect = ["TEST", "SENSOR-TYPE-1", "PRIVACY-1"]
with self.store.session_scope():
self.store.add_to_sensor_types("SENSOR-TYPE-1", self.change_id)
self.store.add_to_privacies("PRIVACY-1", self.change_id)
sensor_name, sensor_type, privacy = self.resolver.resolve_sensor(
self.store,
"TEST",
sensor_type=None,
privacy=None,
change_id=self.change_id,
)
self.assertEqual(sensor_name, "TEST")
self.assertEqual(sensor_type.name, "SENSOR-TYPE-1")
self.assertEqual(privacy.name, "PRIVACY-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_resolve_sensor_add_to_synonyms(self, menu_prompt):
"""Test whether the given sensor name is correctly added to Synonyms table or not"""
# Select "Search an existing sensor"->Search "SENSOR-1"->Select "Yes"
menu_prompt.side_effect = ["1", "SENSOR-1", "1"]
with self.store.session_scope():
# Create platform first, then create a Sensor object
sensor_type = self.store.add_to_sensor_types(
"SENSOR-TYPE-1", self.change_id
)
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
nationality = self.store.add_to_nationalities("UK", self.change_id).name
platform_type = self.store.add_to_platform_types(
"PLATFORM-TYPE-1", self.change_id
).name
platform = self.store.get_platform(
platform_name="Test Platform",
nationality=nationality,
platform_type=platform_type,
privacy=privacy,
change_id=self.change_id,
)
sensor = platform.get_sensor(
self.store, "SENSOR-1", sensor_type, privacy, change_id=self.change_id
)
synonym_sensor = self.resolver.resolve_sensor(
self.store,
"SENSOR-TEST",
sensor_type,
privacy,
change_id=self.change_id,
)
self.assertEqual(synonym_sensor.sensor_id, sensor.sensor_id)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_resolver_sensor_make_further_edit(self, resolver_prompt, menu_prompt):
"""Test whether correct sensor type and privacy returns after resolver is further edited"""
# Select "Add a new sensor"->Type "TEST"->Select "No"->Type "TEST"->
# Select "Search for an existing sensor-type"->Search "SENSOR-TYPE-2"->
# Select "Search for an existing classification"->Search "PRIVACY-2"->Select "Yes"
menu_prompt.side_effect = [
"2",
"2",
"1",
"SENSOR-TYPE-2",
"1",
"PRIVACY-2",
"1",
]
resolver_prompt.side_effect = ["TEST", "TEST"]
with self.store.session_scope():
sensor_type = self.store.add_to_sensor_types(
"SENSOR-TYPE-1", self.change_id
)
sensor_type_2 = self.store.add_to_sensor_types(
"SENSOR-TYPE-2", self.change_id
)
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id)
privacy_2 = self.store.add_to_privacies("PRIVACY-2", self.change_id)
(
resolved_name,
resolved_type,
resolved_privacy,
) = self.resolver.resolve_sensor(
self.store, "TEST", sensor_type.name, privacy.name, self.change_id
)
self.assertEqual(resolved_name, "TEST")
self.assertEqual(resolved_type.sensor_type_id, sensor_type_2.sensor_type_id)
self.assertEqual(resolved_privacy.privacy_id, privacy_2.privacy_id)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_fuzzy_search_add_sensor(self, resolver_prompt, menu_prompt):
"""Test whether a new Sensor entity created or not after searched
and not founded in the Sensor Table."""
# Select "Search an existing sensor"->Search "SENSOR-1"->Select "No"->Type "SENSOR-TEST"->
# Select "Search for an existing sensor-type"->Search "SENSOR-TYPE-1"->
# Select "Search an existing classification"->Search "PRIVACY-1"->Select "Yes"
menu_prompt.side_effect = [
"1",
"SENSOR-1",
"2",
"1",
"SENSOR-TYPE-1",
"1",
"PRIVACY-1",
"1",
]
resolver_prompt.side_effect = ["SENSOR-TEST"]
with self.store.session_scope():
# Create platform first, then create a Sensor object
sensor_type = self.store.add_to_sensor_types(
"SENSOR-TYPE-1", self.change_id
)
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
nationality = self.store.add_to_nationalities("UK", self.change_id).name
platform_type = self.store.add_to_platform_types(
"PLATFORM-TYPE-1", self.change_id
).name
platform = self.store.get_platform(
platform_name="Test Platform",
nationality=nationality,
platform_type=platform_type,
privacy=privacy,
change_id=self.change_id,
)
platform.get_sensor(
self.store, "SENSOR-1", sensor_type, privacy, self.change_id
)
platform.get_sensor(
self.store, "SENSOR-2", sensor_type, privacy, self.change_id
)
sensor_name, sensor_type, privacy = self.resolver.resolve_sensor(
self.store,
"SENSOR-TEST",
sensor_type=None,
privacy=None,
change_id=self.change_id,
)
self.assertEqual(sensor_name, "SENSOR-TEST")
self.assertEqual(sensor_type.name, "SENSOR-TYPE-1")
self.assertEqual(privacy.name, "PRIVACY-1")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_fuzzy_search_add_sensor_alternative(self, resolver_prompt, menu_prompt):
"""Test whether a new Sensor entity created when the Sensor Table is empty."""
# Select "Search an existing sensor"->Search "SENSOR-1"->Type "SENSOR-TEST"->Select "Yes"
menu_prompt.side_effect = [
"1",
"SENSOR-1",
"1",
]
resolver_prompt.side_effect = ["SENSOR-TEST"]
with self.store.session_scope():
sensor_type = self.store.add_to_sensor_types(
"SENSOR-TYPE-1", self.change_id
)
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
sensor_name, sensor_type, privacy = self.resolver.resolve_sensor(
self.store,
"SENSOR-TEST",
sensor_type=sensor_type.name,
privacy=privacy,
change_id=self.change_id,
)
self.assertEqual(sensor_name, "SENSOR-TEST")
class CancellingAndReturnPreviousMenuTestCase(unittest.TestCase):
def setUp(self) -> None:
self.resolver = CommandLineResolver()
self.store = DataStore(
"",
"",
"",
0,
":memory:",
db_type="sqlite",
missing_data_resolver=self.resolver,
)
self.store.initialise()
with self.store.session_scope():
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_top_level_quitting(self, menu_prompt):
"""Test whether "." quits from the resolve platform/datafile/sensor"""
menu_prompt.side_effect = [".", ".", "."]
with self.store.session_scope():
with self.assertRaises(SystemExit):
self.resolver.resolve_datafile(self.store, "", "", "", self.change_id),
with self.assertRaises(SystemExit):
self.resolver.resolve_platform(
self.store, "", "", "", "", self.change_id
)
with self.assertRaises(SystemExit):
self.resolver.resolve_sensor(self.store, "", "", "", self.change_id)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_datafile(self, menu_prompt):
"""Test whether "." returns to resolve datafile"""
# Type "DATAFILE-1"->Select "."->Select "."->Select "."
menu_prompt.side_effect = ["DATAFILE-1", ".", ".", "."]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id)
datafile_type = self.store.add_to_datafile_types(
"DATAFILE-TYPE-1", self.change_id
)
self.store.add_to_datafiles(
file_type=datafile_type.name,
privacy=privacy.name,
reference="DATAFILE-1",
change_id=self.change_id,
)
with self.assertRaises(SystemExit):
self.resolver.fuzzy_search_datafile(
self.store, "TEST", "", "", self.change_id
)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_platform(self, menu_prompt):
"""Test whether "." returns to resolve platform"""
# Search "PLATFORM-1"->Select "."->Select "."->Select "."
menu_prompt.side_effect = ["PLATFORM-1", ".", ".", "."]
with self.store.session_scope():
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id)
platform_type = self.store.add_to_platform_types("Warship", self.change_id)
nationality = self.store.add_to_nationalities("UK", self.change_id)
self.store.get_platform(
"PLATFORM-1",
nationality=nationality.name,
platform_type=platform_type.name,
privacy=privacy.name,
change_id=self.change_id,
)
with self.assertRaises(SystemExit):
self.resolver.fuzzy_search_platform(
self.store, "TEST", "", "", "", self.change_id
)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
def test_cancelling_fuzzy_search_sensor(self, menu_prompt):
"""Test whether "." returns to resolve sensor"""
# Type "SENSOR-1"->Select "."->Select "."->Select "."
menu_prompt.side_effect = ["SENSOR-1", ".", ".", "."]
with self.store.session_scope():
sensor_type = self.store.add_to_sensor_types(
"SENSOR-TYPE-1", self.change_id
)
privacy = self.store.add_to_privacies("PRIVACY-1", self.change_id).name
nationality = self.store.add_to_nationalities("UK", self.change_id).name
platform_type = self.store.add_to_platform_types(
"PLATFORM-TYPE-1", self.change_id
).name
platform = self.store.get_platform(
platform_name="Test Platform",
nationality=nationality,
platform_type=platform_type,
privacy=privacy,
change_id=self.change_id,
)
platform.get_sensor(
self.store, "SENSOR-1", sensor_type, privacy, self.change_id
)
with self.assertRaises(SystemExit):
self.resolver.fuzzy_search_sensor(
self.store, "TEST", "", "", self.change_id
)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_cancelling_during_add_to_platforms(self, resolver_prompt, menu_prompt):
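"""Test that cancelling with "." at each nested prompt while adding a platform exits the import."""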
menu_prompt.side_effect = [
".",
".",
"2",
".",
".",
"2",
"2",
".",
".",
"2",
"2",
"2",
".",
".",
]
resolver_prompt.side_effect = [
"TEST",
"TST",
"TEST",
"123",
"TEST",
"TST",
"TEST",
"123",
"UK",
"TEST",
"TST",
"TEST",
"123",
"UK",
"TYPE-1",
"TEST",
"TST",
"TEST",
"123",
"UK",
"TYPE-1",
"PRIVACY-1",
]
with self.store.session_scope():
# Type name/trigraph/quadgraph/pennant number->Select "Cancel nationality search"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_platforms(
self.store, "PLATFORM-1", "", "", "", self.change_id
)
# Type name/trigraph/quadgraph/pennant number->Select "Add new nationality"->Type "UK"->Select
# "Cancel platform type search"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_platforms(
self.store, "PLATFORM-1", "", "", "", self.change_id
)
# Type name/trigraph/quadgraph/pennant number->Select "Add new nationality"->Type "UK"->
# Select "Add a new platform type"->Type "TYPE-1"->Select "Cancel classification search"->Select "Cancel
# import"
with self.assertRaises(SystemExit):
self.resolver.add_to_platforms(
self.store, "PLATFORM-1", "", "", "", self.change_id
)
# Type name/trigraph/quadgraph/pennant number->Select "Add new nationality"->Type "UK"->
# Select "Add a new platform type"->Select "Add new classification"->Type "PRIVACY-1"->Type
# "TYPE-1"->Select "Cancel import"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_platforms(
self.store, "PLATFORM-1", "", "", "", self.change_id
)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_cancelling_during_add_to_datafiles(self, resolver_prompt, menu_prompt):
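"""Test whether "." cancels and quits from add_to_datafiles at each prompt"""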
menu_prompt.side_effect = [".", ".", "2", ".", ".", "2", "2", ".", "."]
resolver_prompt.side_effect = [
"TEST",
"TEST",
"DATAFILE-TYPE-1",
"TEST",
"DATAFILE-TYPE-1",
"PRIVACY-1",
]
with self.store.session_scope():
# Type "TEST"->Select "Cancel datafile type search"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_datafiles(
self.store, "DATAFILE-1", "", "", self.change_id
)
# Type "TEST"->Select "Add a new datafile type"->Type "DATAFILE-TYPE-1->
# Select "Cancel classification search" ->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_datafiles(
self.store, "DATAFILE-1", "", "", self.change_id
)
# Type "TEST"->Select "Add a new datafile type"->Type "DATAFILE-TYPE-1->
# Select "Add a new classification"->Type "PRIVACY-1"->Select "Cancel import"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_datafiles(
self.store, "DATAFILE-1", "", "", self.change_id
)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_cancelling_during_add_to_sensors(self, resolver_prompt, menu_prompt):
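"""Test whether "." cancels and quits from add_to_sensors at each prompt"""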
menu_prompt.side_effect = [".", ".", "2", ".", ".", "2", "2", ".", "."]
resolver_prompt.side_effect = [
"TEST",
"TEST",
"SENSOR-TYPE-1",
"TEST",
"SENSOR-TYPE-1",
"PRIVACY-1",
]
with self.store.session_scope():
# Type "TEST"->Select "Cancel sensor type search"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_sensors(
self.store, "SENSOR-1", "", "", self.change_id
)
# Type "TEST"->Select "Add a new sensor type"->Type "SENSOR-TYPE-1->
# Select "Cancel classification search" ->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_sensors(
self.store, "SENSOR-1", "", "", self.change_id
)
# Type "TEST"->Select "Add a new sensor type"->Type "SENSOR-TYPE-1->
# Select "Add a new classification"->Type "PRIVACY-1"->Select "Cancel import"->Select "Cancel import"
with self.assertRaises(SystemExit):
self.resolver.add_to_sensors(
self.store, "SENSOR-1", "", "", self.change_id
)
class GetMethodsTestCase(unittest.TestCase):
def setUp(self) -> None:
self.file_path = os.path.join(DIR_PATH, "test.db")
self.store = DataStore(
"",
"",
"",
0,
self.file_path,
db_type="sqlite",
missing_data_resolver=CommandLineResolver(),
)
with self.store.session_scope():
self.store.initialise()
self.store.populate_reference()
self.store.populate_metadata()
self.store.populate_measurement()
self.change_id = self.store.add_to_changes(
"TEST", datetime.utcnow(), "TEST"
).change_id
def tearDown(self) -> None:
if os.path.exists(self.file_path):
os.remove(self.file_path)
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_get_platform_adds_resolved_platform_successfully(
self, resolver_prompt, menu_prompt
):
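"""Test whether a platform resolved by CommandLineResolver is added to the Platforms table"""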
menu_prompt.side_effect = [
"2",
"1",
"NETHERLANDS",
"1",
"TYPE-1",
"1",
"PRIVACY-1",
"1",
]
resolver_prompt.side_effect = ["Test Platform", "Tst", "Test", "123"]
with self.store.session_scope():
platforms = self.store.session.query(self.store.db_classes.Platform).all()
# there must be 2 entities at the beginning
self.assertEqual(len(platforms), 2)
self.store.get_platform("Test Platform", change_id=self.change_id)
platforms = self.store.session.query(self.store.db_classes.Platform).all()
self.assertEqual(len(platforms), 3)
self.assertEqual(platforms[2].name, "Test Platform")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_get_datafile_adds_resolved_datafile_successfully(
self, resolver_prompt, menu_prompt
):
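"""Test whether a datafile resolved by CommandLineResolver is added to the Datafiles table"""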
menu_prompt.side_effect = [
"2",
"1",
"DATAFILE-TYPE-1",
"1",
"PRIVACY-1",
"1",
]
resolver_prompt.side_effect = ["DATAFILE-TEST"]
with self.store.session_scope():
datafiles = self.store.session.query(self.store.db_classes.Datafile).all()
# there must be 2 entities at the beginning
self.assertEqual(len(datafiles), 2)
self.store.get_datafile("test", change_id=self.change_id)
datafiles = self.store.session.query(self.store.db_classes.Datafile).all()
self.assertEqual(len(datafiles), 3)
self.assertEqual(datafiles[2].reference, "DATAFILE-TEST")
@patch("pepys_import.resolvers.command_line_resolver.create_menu")
@patch("pepys_import.resolvers.command_line_resolver.prompt")
def test_get_sensor_adds_resolved_sensor_successfully(
self, resolver_prompt, menu_prompt
):
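"""Test whether a sensor resolved by CommandLineResolver is added to the Sensors table"""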
menu_prompt.side_effect = [
"2",
"1",
"SENSOR-TYPE-1",
"1",
"PRIVACY-1",
"1",
]
resolver_prompt.side_effect = ["SENSOR-TEST"]
with self.store.session_scope():
sensors = self.store.session.query(self.store.db_classes.Sensor).all()
# there must be 2 entities at the beginning
self.assertEqual(len(sensors), 2)
platform = self.store.get_platform("PLATFORM-1", change_id=self.change_id)
platform.get_sensor(self.store, "SENSOR-TEST", change_id=self.change_id)
# there must be 3 entities now
sensors = self.store.session.query(self.store.db_classes.Sensor).all()
self.assertEqual(len(sensors), 3)
self.assertEqual(sensors[2].name, "SENSOR-TEST")
if __name__ == "__main__":
unittest.main()
|
1. Technical Field
The present invention relates in general to electronic devices and, in particular, to selectively controlling electronic devices. Still more particularly, the present invention relates to conditionally controlling an electronic device according to conditional criteria designated at the electronic device by a manufacturer, user or authority to the user.
2. Description of the Related Art
Many electronic devices are available that provide users with messaging, telephony and processing. For example, electronic devices may include pervasive devices such as mobile telephones, pagers, wrist watches, personal digital assistants (PDAs) and others. In addition, electronic devices may encompass laptop computers, personal storage devices such as smart cards, compact disc players, radios, and other devices whose use may prove to be obtrusive or unwelcome in particular areas.
In many cases, such electronic devices include audio transducers that alert a user to an incoming phone call, page or time. Such audible alerts can be helpful to a user; however, they may also be obtrusive in certain environments such as a theater, classroom, hospital, etc. In addition, use of such electronic devices, and in particular use of mobile telephones, can be obtrusive in settings such as an airplane.
Therefore, it has become desirable to include a control circuit within such electronic devices that may be utilized to control both audible alerts and usage of electronic devices. In particular, such control circuits are often designed such that the user does not have any control over conditions that cause audible alerts and usage of electronic devices to be inhibited.
For example, U.S. Pat. Nos. 5,192,947 and 5,224,150 provide that in public places or other controlled zones a localized transmitter can be utilized to disable all audio transducers included in all pagers able to detect an encoded signal transmitted by the localized transmitter within an immediate vicinity. In a particular example, a localized transmitter may emit an encoded signal within an auditorium such that if a person with a pager enabled to detect the encoded signal walks into the auditorium, the pager's audio transducer is disabled. U.S. Pat. No. 5,842,112 provides an example where the localized transmitter is either an ultrasonic or infrared transmitter.
In another example, U.S. Pat. No. 5,815,407 provides for inhibiting operation of an electronic device within an airplane during take-off and landing. In particular, a sensor may be included within the electronic device that detects changes in lateral acceleration and inhibits operation of the electronic device if the detected lateral acceleration exceeds a particular threshold indicative of take-off or landing of an aircraft. In addition, a signal indicating take-off or landing of an aircraft may be transmitted within the aircraft, such that the electronic device detects the signal and automatically inhibits operation of the electronic device.
In yet another example, U.S. Pat. No. 5,907,613 provides a silencing system where a switch is electronically connected in series between a telephone line and a telephone receiver where in one position the telephone receiver is operational and in the other position the telephone receiver is disconnected from the telephone line. A remote button is provided that when pressed transmits a signal to the switch to change positions.
In the examples provided, the electronic devices are equipped with a control system that automatically inhibits the electronic device where the user is preferably not able to override the control system. In addition, in the examples provided, a transmitter sends a signal to the control system of the electronic device that inactivates the electronic device according to the type of signal. However, it would be advantageous to not only inhibit use of an electronic device when within a particular area, such as a theater, but to inhibit use of an electronic device dependent upon multiple types of conditions that may be specified by the user of the electronic device, or by another.
In addition, in view of the foregoing, it would be desirable to not only provide a control system for electronic devices that can receive control signals from a control tower and local transmitter, but also detect the presence of other electronic devices within a particular proximity and conditionally control a level of usage based on the proximity of other electronic devices. It would be desirable to allow a user or other to specify conditions for usage of an electronic device dependent upon the proximity of other electronic devices and other conditions such as time and environment.
In view of the foregoing, it is therefore an object of the present invention to provide an improved electronic device.
It is another object of the present invention to provide a method, system and program for selectively controlling electronic devices.
It is yet another object of the present invention to provide an improved method, system and program for conditionally controlling an electronic device according to conditional criteria designated at the electronic device by a manufacturer, user or authority to the user.
In accordance with the present invention, an electronic device is enabled to detect the proximity of other electronic devices. Multiple proximity based conditions for usage of the electronic device may be provided by a manufacturer, user and other authorities at the electronic device. The proximity of other electronic devices is compared with the proximity based conditions and a level of usage of the electronic device is determined, such that the level of usage of the electronic device is conditionally adjusted according to the proximity of other electronic devices.
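As a purely illustrative sketch (not taken from the patent disclosure), the comparison of detected device proximity against user- or authority-specified conditions could be organized as follows; the condition fields, usage levels and thresholds below are hypothetical:

# Hypothetical sketch of proximity-based conditional control; illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class ProximityCondition:
    device_type: str        # kind of nearby device the condition refers to
    max_distance_m: float   # condition matches when such a device is at least this close
    usage_level: str        # e.g. "full", "silent", "disabled"
    priority: int           # higher-priority conditions override lower ones

@dataclass
class DetectedDevice:
    device_type: str
    distance_m: float

def resolve_usage_level(detected: List[DetectedDevice],
                        conditions: List[ProximityCondition],
                        default_level: str = "full") -> str:
    """Return the usage level of the highest-priority condition matched by any detected device."""
    matches = [c for c in conditions for d in detected
               if d.device_type == c.device_type and d.distance_m <= c.max_distance_m]
    if not matches:
        return default_level
    return max(matches, key=lambda c: c.priority).usage_level

# Example: silence audible alerts when another mobile telephone is detected within 10 m.
rules = [ProximityCondition("mobile_telephone", 10.0, "silent", priority=1)]
print(resolve_usage_level([DetectedDevice("mobile_telephone", 4.2)], rules))  # -> silent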
All objects, features, and advantages of the present invention will become apparent in the following detailed written description. |
Over 2000 SF finished in this updated 2 story. Home features 3 beds and 3 baths total with a master suite, a large open main level floor plan, updated kitchen with stainless steel appliances, granite tops, and soft close cabinetry. Lower level is partially finished with plenty of storage. Waterproofing and radon mitigation systems in place. 2 car attached garage is heated, insulated and has additional loft storage space. Exterior features newer siding, roof and stone, a good sized deck and storage shed with electrical run to it. Back yard is fenced, tree lined and backs to light timber. Convenient location close to parks, bike trails, shopping and all that Polk City has to offer. Come check it out! |
MS-DIAL 4: accelerating lipidomics using an MS/MS, CCS, and retention time atlas We formulated mass spectral fragmentations of lipids across 117 lipid subclasses and included ion mobility tandem mass spectrometry (MS/MS) to provide a comprehensive lipidome atlas with retention time, collision cross section, and MS/MS information. The all-in-one solution from import of raw MS data to export of a common output format (mztab-M) was packaged in MS-DIAL 4 (http://prime.psc.riken.jp/) providing an enhanced standardized untargeted lipidomics procedure following lipidomics standards initiative (LSI) semi-quantitative definitions and shorthand notation system of lipid structures with a 12% estimated false discovery rate, which will contribute to harmonizing lipidomics data across laboratories to accelerate lipids research. |
A. Background
The development of methods for the sequential degradation of proteins and peptides from the carboxy-terminus has been the objective of several studies. See Ward, C. W., Practical Protein Chemistry--A Handbook (Darbre, A., ed.) (1986) and Rangarajan, M., Protein/Peptide Sequence Analysis: Current (1988). Such a method would complement existing N-terminal degradations based on the Edman chemistry. Edman, P., Acta.Chem.Scand. 4:283-293 (1950). The most widely studied method and probably the most attractive because of its similarity to the Edman degradation has been the conversion of amino acids into thiohydantoins. This reaction, originally observed by Johnson and Nicolet, J.Am.Chem.Soc. 33:1973-1978 (1911), was first applied to the sequential degradation of proteins from the carboxy-terminus by Schlack and Kumpf, Z.Physiol.Chem. 154:125-170 (1926). These authors reacted ammonium thiocyanate, dissolved in acetic acid and acetic anhydride, with N-benzoylated peptides to form carboxyl-terminal 1-acyl-2-thiohydantoins. Exposure to strong base was used to liberate the amino acid thiohydantoin and generate a new carboxyl-terminal amino acid. The main disadvantages of this procedure have been the severity of the conditions required for complete derivatization of the C-terminal amino acid and for the subsequent cleavage of the peptidylthiohydantoin derivative into a new shortened peptide and an amino acid thiohydantoin derivative.
Since this work was published, numerous groups have tried to reduce the severity of the conditions required, particularly in the cleavage of the peptidylthiohydantoin, in order to apply this chemistry to the sequential degradation of proteins from the carboxyl terminal end. Lesser concentrations of sodium hydroxide than originally used by Schlack and Kumpf and of barium hydroxide were found to effectively cleave peptidylthiohydantoins. See Waley, S. G., et al., J.Chem.Soc. 1951:2394-2397 (1951); Kjaer, A., et al., Acta Chem.Scand. 6:448-450 (1952); Turner, R. A., et al., Biochim.Biophys.Acta. 13:553-559 (1954). Other groups used acidic conditions based on the original procedure used by Johnson and Nicolet for the de-acetylation of amino acid thiohydantoins. See Tibbs, J., Nature 168:910 (1951); Baptist, V. H., et al., J.Am.Chem.Soc. 75:1727-1729 (1953). These authors added concentrated hydrochloric acid to the coupling solution to cause cleavage of the peptidylthiohydantoin bond. Unlike hydroxide, which was shown to cause breakdown of the thiohydantoin amino acids, hydrochloric acid was shown not to destroy the amino acid thiohydantoins. See Scoffone, E., et al., Ric.Sci. 26:865-871 (1956); Fox, S. W., et al., J.Am.Chem. Soc. 77:3119-3122 (1955); Stark, G. R., Biochem. 7:1796-1807 (1968). Cromwell, L. D., et al., Biochem. 8:4735-4740 (1969) showed that the concentrated hydrochloric acid could be used to cleave the thiohydantoin amino acid at room temperature. The major drawback with this procedure was that when applied to proteins, no more than two or three cycles could be performed.
Yamashita, S., Biochem.Biophys.Acta. 229:301-309 (1971) found that cleavage of peptidylthiohydantoins could be done in a repetitive manner with a protonated cation exchange resin. Application of this procedure to 100 µmol quantities of papain and ribonuclease was reported to give 14 and 10 cycles, respectively, although no details were given. See Yamashita, S., et al., Proc.Hoshi.Pharm. 13:136-138 (1971). Stark reported that certain organic bases, such as morpholine or piperidine, could be substituted for sodium hydroxide, and along the same lines, Kubo, H., et al., Chem.Pharm.Bull. 19:210-211 (1971) reported that aqueous triethylamine (0.5M) could be used to effectively cleave peptidylthiohydantoins. Stark appeared to have solved the cleavage problem by introducing acetohydroxamic acid in aqueous pyridine at pH 8.2 as a cleavage reagent. This reagent was shown to rapidly and specifically cleave peptidylthiohydantoins at room temperature and at mild pH.
Conditions for the formation of the peptidylthiohydantoins were improved by Stark and Dwulet, F. E., et al., Int.J.Peptide and Protein Res. 13:122-129 (1979), who reported on the use of thiocyanic acid rather than thiocyanate salts, and more recently by the introduction of trimethylsilylisothiocyanate (TMS-ITC) as a coupling reagent. See Hawke, D. H., et al., Anal.Biochem. 166:298-307 (1987). The use of this reagent for C-terminal sequencing has been patented. See Hawke U.S. Pat. No. 4,837,165. This reagent significantly improved the yields of peptidylthiohydantoin formation and reduced the number of complicating side products. Cleavage of peptidylthiohydantoins by 12 N HCl (Hawke, 1987) and by acetohydroxamate (Miller, C. G., et al., Techniques in Protein Chemistry (Hugli, T. E., ed.) pp. 67-68, Academic Press (1989)) failed to yield more than a few cycles of degradation.
B. The Cleavage Problem
Although the cleavage reaction has been extensively studied since the thiocyanate chemistry for C-terminal degradation was first proposed by Schlack and Kumpf in 1926, a chemical method has not yet been proposed that is capable of an extended degradation. Cleavage in 1N sodium hydroxide as first proposed by Schlack and Kumpf (1926) is well known to hydrolyze proteins and peptides at other sites in addition to cleavage of the C-terminal peptidylthiohydantoin. The released thiohydantoin amino acid derivatives are also known to be unstable in hydroxide solutions. Scoffone, supra. Cleavage by hydroxide is known to convert the side chain amide groups of asparagine and glutamine residues to a carboxylic group making these residues indistinguishable from aspartate and glutamate, respectively.
When cleavage of peptidylthiohydantoins by 12N HCl was applied to proteins and peptides, no more than 2 or 3 cycles could be performed. See, Cromwell, supra and Hawke, supra. This was probably due to differences in the rate of hydrolysis of peptidylthiohydantoins containing different amino acid side chains as well as to hydrolysis of other internal amide bonds. Likewise, during the synthesis of the standard amino acid thiohydantoin derivatives corresponding to the naturally occurring amino acids, it was observed that the rate of deacetylation of the N-acetylthiohydantoin amino acids by 12 N HCl depended on the nature of the amino acid side chain. Bailey, J. M., et al. Biochem. 29:3145-3156 (1990).
Attempts by Dwulet, supra, to reproduce the resin-based cleavage method of Yamashita, supra, were reported to be unsuccessful. Cleavage of peptidylthiohydantoins with aqueous methanesulfonic acid was also attempted by Dwulet and by Bailey, et al., both without success. Methanesulfonic acid was chosen since it is equivalent to the acidic group on the resin employed by Yamashita (1971) and Yamashita, et al. (1971).
Cleavage of the peptidylthiohydantoin derivatives with acetohydroxamate as originally reported by Stark, supra, was found to result in the formation of stable hydroxamate esters at the C-terminus of the shortened peptide (Bailey, et al., supra). Depending on the conditions employed, between 68% and 93% of the peptide was derivatized at the C-terminus and thus prevented from further sequencing. Although Stark, supra, predicted such hydroxamate esters to form as an intermediate during cleavage, it was assumed that they would break down under the conditions used for cleavage or continued sequencing. The peptidyl hydroxamate esters formed from cleavage with acetohydroxamate, like the hydroxamate esters studied by Stieglitz, J., et al., J.Am.Chem.Soc. 36:272-301 (1914) and Scott, A. W., et al., J.Am.Chem.Soc. 49:2545-2549 (1927), are stable under the acidic conditions used for thiohydantoin formation and can only be hydrolyzed to a free peptidyl carboxylic group, capable of continued sequencing, under strongly basic conditions. This probably explains the low repetitive yields of Stark, supra; Meuth, J. L., et al., Biochem. 21:3750-3757 (1982) and Miller, supra, when aqueous acetohydroxamate was employed as a cleavage reagent.
Cleavage of peptidylthiohydantoins by aqueous triethylamine was originally reported by Kubo, H., et al., Chem.Pharm.Bull. 19:210-211 (1971), Dwulet, et al., supra, and Meuth, et al., supra. The latter group commented on the usefulness of triethylamine as a cleavage reagent for automated sequencing because of its volatility, but declined to pursue this method apparently in favor of cleavage by acetohydroxamate. Cleavage of peptidylthiohydantoins, in the solution phase, by a 2% aqueous solution of triethylamine was found to be rapid (half-times of 1 min. and 5 min. at 37°C and 22°C, respectively) and quantitative, yielding only shortened peptide capable of continued sequencing and the amino acid thiohydantoin derivative. Bailey, et al., supra. |
Politico-moral Transactions in Indian AIDS Service: Confidentiality, Rights and New Modalities of Governance This article examines the rise of non-governmental AIDS service in India as a space of cultural politics and of possibilities for social transformation. Drawing on ethnographic material from an AIDS service NGO in an urban North Indian setting, and the network of organizations that it is part of, the article describes the emergence of a transnationally mobile community of AIDS experts, their relationship to the non-governmental and the state and the circulation of ideas and practices between the global and the local. It focuses on the politico-moral transactions around confidentiality and embedded within it, the discourse of rights to show how they reflect changing configurations of governance and citizenship, and redefinitions of health. |
Scott Engle of Park Ridge, one of the founders of Solid Rock Carpenters, along with brothers Mark and Jeff, answered some questions for the Park Ridge Herald-Advocate about the organization. The group's website states its mission of working with "community service leaders, churches and ministries assisting in building and repair projects" for those in need. For information, visit www.solidrockcarpenters.org.
Q: Why did you call it Solid Rock Carpenters?
A: My brothers and I are Christians who strongly believe in giving back to others from the blessing we have received. We sit in offices and drive desks for our day jobs and were drawn to get out of our offices and work with our hands. From experience working with Habitat for Humanity, we grew passionate about building homes for those in need. When we began to organize groups of volunteers to help with the recovery effort from Hurricane Katrina, the Bible verse from Matthew that talks about building on a solid rock foundation inspired the name.
Q: The organization started in 2005, right? Is it where you thought it would be at this point?
A: We had no idea what would become of our efforts when we began. It seemed like a natural thing to do. We organize volunteers in our work life, so doing it as an opportunity to serve others came naturally. We put into practice what we had learned from participating in trips led by other groups over the years and were blessed to be able to partner with some wonderful local organizations that partnered us with people in need and the logistics necessary to fulfill their needs for a place to live.
Q: What's the most rewarding part of the effort?
A: The people. On our trips we get to meet the most amazing people. Many of the homeowners we build for have overcome amazing hardships and yet often have a positive outlook on life that is both humbling and inspiring. The people who come on our trips are also very special. They have a heart for serving, and that common bond makes for quick and long-lasting friendship. We have many people who plan their family vacations around our trips and have been doing so for years.
Q: What's the most challenging part of the effort?
A: Finding local organizations that are well organized and connected in their communities. After Katrina, we developed a wonderful working relationship with the Habitat for Humanity affiliate in Washington Parish, La. We currently have a great relationship with the Appalachian Service Project in Johnson City, Tenn.
A: Yes. Kids need to come with an adult. Families are encouraged to serve together.
Q: Are you able to talk with any recipients of the homes?
A: Yes. Getting to know and often work with a homeowner and their family is one of the things that makes our work so meaningful. We often return to visit with homeowners, and they come and share a meal with us when we are in their community. |
Eltrombopag Use for Treatment of Thrombocytopenia in a Patient with Chronic Liver Disease and Portal Vein Thrombosis: Case Report
Off-label drug use refers to drug use beyond the specifications authorized for marketing. Eltrombopag is a thrombopoietin receptor agonist that has been used in treating thrombocytopenia due to chronic liver disease (CLD) as an off-label medication. Treatment of thrombocytopenia in patients with CLD constitutes a real dilemma, as the options are limited and some of them are invasive. However, thrombopoietin receptor agonists have been increasingly used for this purpose. Here we report a 34-year-old woman who was diagnosed with CLD due to autoimmune hepatitis 20 years ago. Her condition was complicated with portal vein thrombosis, chronic thrombocytopenia, and variceal hemorrhage, and she has been listed as a candidate for liver transplantation. Given her high risk of bleeding, we started her on a low dose of eltrombopag (25 mg daily) in order to maintain a platelet level of ≥50 × 10³/L. However, 1 year after initiation of the therapy, she developed left lower limb deep vein thrombosis.
Introduction
Chronic liver disease (CLD) is defined as "progressive destruction of the liver parenchyma over a period greater than 6 months leading to fibrosis and cirrhosis". Thrombocytopenia is a well-known complication of CLD. It is classified into moderate (less than 100 × 10⁹/L) and severe (less than 50 × 10⁹/L) thrombocytopenia. The prevalence of thrombocytopenia in patients with liver disease ranges from 6% among patients without cirrhosis to 78% in patients with cirrhosis. The pathophysiology of thrombocytopenia in these patients is multifactorial; however, the main mechanisms are platelet sequestration in the spleen and decreased production of thrombopoietin in the liver. The presence of thrombocytopenia can significantly complicate routine patient care, given that most patients with CLD frequently undergo medical procedures for diagnosis and treatment, some of which are invasive. In addition, it limits the use of pegylated interferon therapy for patients with liver cirrhosis due to HCV infection. Furthermore, treatment of thrombocytopenia in CLD is challenging, in that platelet transfusion can provide temporary correction of thrombocytopenia but does not ensure maintenance of hemostatic platelet levels. Several other treatment options have been used, such as interventional splenic artery embolization, surgical splenectomy, and recently thrombopoietin, which regulates megakaryocyte maturation and platelet production. Currently, three oral thrombopoietin agents are available to elevate platelet counts; two of these agents (avatrombopag and lusutrombopag) were approved in 2018 by the US Food and Drug Administration (FDA) for the purpose of increasing platelet counts in patients with CLD prior to an invasive procedure. Although eltrombopag has gained approval from the FDA for chronic immune thrombocytopenia, its use is not recommended in patients with CLD due to the increased risk of venous thromboembolism. However, it has been used in several studies as an off-label medication for treatment of thrombocytopenia in CLD. In this case report, we are highlighting the challenge of treating thrombocytopenia in a patient with CLD complicated by variceal hemorrhage and portal vein thrombosis.
Case Presentation
We report a 34-year-old Palestinian woman who is known to have SLE and CLD due to autoimmune hepatitis, both diagnosed 20 years ago.
As her CLD progressed, she developed advanced liver cirrhosis with portal hypertension and massive splenomegaly. In 2012, she had upper gastrointestinal bleeding; esophagogastroduodenoscopy (OGD) revealed esophageal varices, and banding was done. One year later, she had a second episode of variceal hemorrhage. Since that time, she was under regular endoscopic surveillance with frequent banding to control varices and prevent bleeding recurrence. However, despite supportive therapy, she continued to have recurrent hospital admissions due to decompensation. Moreover, her condition was complicated by portal vein thrombosis and chronic thrombocytopenia with fluctuating platelet counts, which reached a nadir of 17 × 10³/L (normal range, 150 × 10³/L to 400 × 10³/L). In 2016, the patient was referred to a liver transplant clinic; at that time, her MELD score was 18, her Child-Pugh classification was B, and she was listed as a candidate for liver transplantation. Given her high risk of variceal hemorrhage, in addition to the use of anticoagulation for treatment of portal vein thrombosis and the plan for a liver transplant, it was judicious to maintain a platelet level of ≥50 × 10³/L. So, the decision to start thrombopoietin was made. In 2017, the patient was started on a low dose of eltrombopag (25 mg daily). The platelet count at the beginning of the treatment was 20 × 10³/L; 4 weeks later, it increased to 58 × 10³/L, and it was maintained at ≥50 × 10³/L (but less than 100 × 10³/L) in the subsequent follow-up on the same dose. In 2018, the patient presented to the emergency department with left lower limb pain and swelling for one week. Doppler ultrasound of the left leg showed thrombosis of the distal posterior tibial vein with mild subcutaneous edema in the left calf. Platelet count at that time was 66 × 10³/L. However, although this event was attributed to the use of eltrombopag, after discussion of the benefits and the risks with the patient, the decision to continue eltrombopag was made.
Discussion
Eltrombopag is an orally bioavailable, small-molecule, thrombopoietin receptor agonist that selectively binds to thrombopoietin receptors on megakaryocyte precursors and megakaryocytes, leading to increased platelet production. It has been approved for treatment of thrombocytopenia due to immune thrombocytopenic purpura. Also, it has been used in patients with chronic HCV infection to allow for the initiation and maintenance of peginterferon-based therapy. However, studies regarding its safety and efficacy in treating thrombocytopenia in CLD due to autoimmune hepatitis are still deficient. In our patient, eltrombopag was used at a low dose, yet it was effective in maintaining platelets above the critical level. Kawaguchi et al. showed in their study that a daily dose of 25 mg of eltrombopag was effective in raising the platelet count to >50 × 10³/L in Japanese patients with CLD in 2 weeks. Nevertheless, our patient needed more than 3 weeks to reach the target platelet level. This can be attributed to inter-ethnic differences in the pharmacokinetics of eltrombopag, which have been reported in previous studies. The decision to start eltrombopag in our patient, who is known to have portal vein thrombosis, was really challenging.
However, the critically low platelet level and being on anticoagulation for treatment of portal vein thrombosis, in addition to the patient's history of recurrent variceal hemorrhage, which carries a high risk of mortality, and the fact that eltrombopag was the only option available, were compelling enough reasons to start this therapy, as it was a life-saving treatment for this patient. Nonetheless, despite using a low dose of eltrombopag and maintaining the platelet level at around 50 × 10³/L to avoid the risk of thrombotic events, our patient developed lower limb deep vein thrombosis one year after starting eltrombopag. One study of eltrombopag use in patients with cirrhosis and thrombocytopenia showed a positive correlation between a platelet count of ≥200 × 10³/L and the incidence of portal vein thrombosis. However, they used a higher dose of eltrombopag (75 mg daily) for 14 days. To date, no study has been conducted regarding the long-term use of eltrombopag in patients with CLD. Hence, more studies need to be carried out in order to explore this area.
Conclusion
Eltrombopag was effective in ameliorating thrombocytopenia in a patient with CLD due to autoimmune hepatitis; however, the risk of thrombosis still exists even with a small dose and a platelet level below 100 × 10³/L. Thus, the decision to start and to continue eltrombopag should be individualized according to the patient's condition.
Statement of Ethics
This case was conducted in accordance with the World Medical Association Declaration of Helsinki. Informed written consent was taken from the patient to publish her case. |
"Just wondering who to contact about getting the benches removed from in front of the Waypoint Bank. The undesirables are there on Saturday mornings. When you go up to the ATM, they just stare at you and make you threatened. I talked to the people in the bank and they know nothing about it. So who do we contact to get rid of the benches? It would probably make things safer."
"In just another example of government waste of our taxpayer money. Did you know that the treasury department spent $30 million on a PR campaign to inform you of the new $20 bills, just so we would all love it and would want to spend it and know what it is? Do you really think it would cost $30 million to introduce us to the new $20 bills? I could just go into the bank and when I got one in change, I would say 'Hey, look at the new $20s.' All it takes is a little advertising on the news or in the newspaper. But to spend $30 million on a campaign for a bill that we are going to have to use anyway, is a waste of money."
"Concerning the do-not-call list. I agree with this 100 percent. I don't want all those unwanted calls either, but I would like to know why the politicians think they have the right to call you. Their calls are really worse than the other calls, they are nothing but phonies. So why should they put themselves on the do-not-call list. They are the biggest pain of them all."
"I would like to know if we are going to get the GAC channel, so we can watch the Grand Ole Opry on cable?"
"I was reading in Mail Call about the seniors having to stand outside in line to get a flu shot. Get an appointment with your doctor, that's all you have to do. He will give you a flu shot."
"In reference to the seniors who have to stand outside to get a flu shot. I don't know where you are going, but everywhere I have went, we go inside where it is warm. I guess you just aren't going to the right places."
"To Terry of Keedysville. You are right on the money in defending Bush. I am really surprised more everyday at all the people who don't realize how dangerous these terrorists really are. They will stop at nothing to kill or destroy. Thanks for your great letter."
"I am calling to say that I agree with Tamara Hoffman regarding respect for our teachers and disciplinary action taken with those who choose to misbehave. I was so glad that I was born when I was, so I was taught respect for others and good moral values and I was educated by the three Rs. I am not a whiz with a computer, but I can read, add subtract, multiply and divide without a calculator or an adding machine. We weren't even allowed to use our fingers to count with."
"I think it's a shame that the bus drivers and assistants have to have their time cut. Everyone said the children need to be safe on the buses. Well, how can they if you have to speed down the road because the transportation office won't give you your right time for your route. And in the summer your time gets cut back, then your time has to be changed when you start back to school. That is a lot of time aggravation for accounting to do. Why do a manifest if you're not going to get your time? They should just pay everyone the same salary. I'm so sick of all the cutbacks and the drivers and assistants are the ones that are hurt. They only get a 2 percent raise. That is not enough raise for what we have to go through every day. I think people need to do our job for a week and the parents need to do our job, too. Kids fighting, yelling through the bus, using profanity and that is just a start. So I think people need to step back and look hard at our job."
"Would the lady who bought the metal shelf racks at the yard sale on Friday, Oct. 10, on Virginia Avenue. Please call me, I have them. My number is 301-582-1642, please call by Saturday, Oct. 18."
"I just read about the Funkstown council discussing the busy traffic and the traffic problem on Oak Ridge Drive. Talking about putting a sign where you turn on Frederick Street by the fire hall. Would it make more sense to put a left turn light there?"
"I would like to thank Helen, the nurse at Western Maryland Center, for being so good to my father and we love her very much. She is going to very missed by everyone who knows her and all the patients are going to miss her. Thanks again for being so good to my father."
"To the person inquiring about the lady bugs in her house. First of all they aren't lady bugs, they are box elder bugs, they resemble a lightning bug without the light. They are harmless, they come in every year at this time. They hang around elder maple trees. That is why they are called box elder bugs. They are looking for someplace warm, they will be gone after the first frost. Maybe within the next few weeks."
"This is someone who lives in Maugansville. Due to the nasty windy weather we have had, we lost our flag. It is orangish in color, has a bushel barrel with apples and has a bird on it. Call me at 301-797-0337 if you find it."
"Can anyone tell me when the Lions are going to have their pancake day breakfast over at the Elks? Someone told me it was this coming Saturday, but I haven't seen any ticket information on it yet."
"I am calling representing the new section of Van Lear Manor in Williamsport. We are following the town of Williamsport and our trick-or-treating will be Friday, Oct. 31 from 6 to 8 p.m. Thanks."
"The flu shot schedule for the Elks Club Lodge is Nov. 24 from 3 to 7 p.m."
"The answer to the person who wanted to know the name of the trampoline center in the Oct. 14 Mail Call on Frederick Street. It was called Bounce Land." |
Lactobacillus rhamnosus GG treatment improves intestinal permeability and modulates microbiota dysbiosis in an experimental model of sepsis
Decrease of 'health-benefiting' microbes and increase of pathogenic bacteria (a condition termed dysbiosis) in intensive care unit patients are considered to induce or aggravate sepsis (gut-origin sepsis). Orally administered probiotics have been effective in the prevention of nosocomial infections. However, the mechanisms of probiotic-induced anti-infection and anti-sepsis remain to be explored. In the present study, 4-week-old C57BL/6 mice were orally administered Lactobacillus rhamnosus GG (LGG) or normal saline (control) 4 weeks prior to cecal ligation and puncture (CLP). A subset of the mice was sacrificed at 24 h post-CLP, and the others were used for survival studies. Ileum tissues, blood and fecal samples were collected. The survival rate of septic mice pretreated with LGG was significantly improved compared with untreated mice. The levels of inflammatory cytokines were reduced in LGG-pretreated septic mice. A decrease of colonic proliferation and epithelial tight junctions and an increase of colonic apoptosis were observed in control septic CLP+saline mice. LGG pretreatment reversed the colonic proliferation, apoptosis and expression of tight junction proteins to the levels of the sham group. LGG pretreatment improved the richness and diversity of the intestinal microbiota in septic mice. The principal coordinates analysis clustering plots revealed significantly separate clustering in microbiota structure among the three groups. Bacteria associated with energy consumption, including Bacteroidetes, with opportunistic infection, including Proteobacteria, Staphylococcaceae and Enterococcaceae, lipopolysaccharide producers, including Enterobacteriaceae, and facultative anaerobes, such as Bacteroidaceae and Erysipelotrichaceae, increased in septic mice. By contrast, bacteria associated with energy harvest, including Firmicutes, intestinal barrier function regulators, including Akkermansia, hepatic function regulators, including Coprococcus and Oscillospira, and obligate anaerobes, including Prevotellaceae, decreased in septic mice. With LGG pretreatment, the sepsis-induced microbiota dysbiosis was reversed. The present results elucidated the potential mechanism of LGG treatment in sepsis, by improving intestinal permeability and modulating microbiota dysbiosis.
Introduction
Sepsis is one of the leading causes of mortality and morbidity in children and adults. An altered gut microflora is always present in patients with sepsis, which induces infective and non-infective complications. In addition, intestinal epithelium dysfunction and dysbiosis, induced by critical illnesses, result in increased translocation of bacteria to the blood, which contributes to the adverse outcome of sepsis. Probiotics are regarded as living microorganisms that, in adequate amounts, can bring health benefits to the host. Among them, the genera Lactobacillus and Bifidobacterium are the most widely used. To date, probiotics have been increasingly applied and studied in clinical practice.
It is believed that probiotics can reduce the risk of disease (including diarrhea, allergic diseases and inflammatory bowel disease) through competition for binding loci and nutrients with pathogens, producing bacteriocins to kill pathogens, synthesizing IgA to modify immune responses and reducing inflammation by stimulating regulatory lymphocytes through interleukin (IL)-10 and transforming growth factor signaling. However, the application of probiotics in sepsis has been limited due to the theoretical risk of aggravating bacteremia in patients with critical illnesses, although few data exist that support this concern. A previous study from our group reported that prophylactic administration of a special probiotic bacterial species in a septic mouse model effectively reduced mortality. However, the underlying mechanisms by which probiotics alleviated the severity of sepsis remain unclear. In order to get a better understanding of the role that the gut microbiota serves in the process of sepsis, the present study performed a comprehensive analysis of the microbiota alterations during sepsis. In the past, the colonization patterns in septic patients were investigated predominantly from culture-dependent studies. However, most of the intestinal bacteria are obligate or facultative anaerobic bacteria, which are technically challenging to culture once exposed to oxygen. Therefore, previous studies merely confirm the absence or presence of specific bacteria, without providing a global view of microbiota alterations during sepsis. Due to the development of bacterial 16S ribosomal DNA gene sequencing, the present study was able to decipher microbial diversity and reveal the alterations of the gut microbiota structure during sepsis. The present study aimed to investigate the effects of probiotic LGG on the microbial diversity in septic mice, and to examine how this contributes to the prevention and reversion of sepsis.
Materials and methods
Ethics statement. All procedures for animal care and use were approved by the Animal Care Ethics Committee of the First Affiliated Hospital, Zhejiang University (Hangzhou, China). Four-week-old male C57BL/6 mice were purchased from Zhejiang University and housed in pathogen-free animal facilities under a standard 12-h light/dark cycle. Standard mouse diet and water were given throughout the study.
Probiotic administration and cecal ligation and puncture (CLP) model. Four-week-old male C57BL/6 mice (weight, 13.59±1.59 g) were administered daily by oral gavage with 200 µl of LGG (2×10⁹ CFU/ml, 2.9×10⁷ CFU/g; Culturelle; ConAgra Foods, Omaha, NE, USA; CLP+LGG group, n=23), or normal saline (control sham group, n=20; CLP+saline group, n=18) 4 weeks prior to CLP. To establish the murine septic peritonitis model, the mice were anesthetized with isoflurane and bupivacaine: 3-4% for induction and 1-3% for maintenance. A 1 cm incision was made in the middle of the abdomen, and the cecum was exteriorized carefully through the incision. In order to induce mid-grade sepsis, the cecum was ligated at the middle of the bottom and distal pole of the cecum, and was punctured from the mesenteric toward the anti-mesenteric direction using a 23-gauge needle. A droplet of feces was extruded through the holes, and the cecum was relocated into the abdominal cavity. Finally, the fascia and skin incision were closed in layers. Sham mice underwent the same operation except for the cecum ligation and puncture procedures.
In all mice, 1 ml pre-warmed normal saline was injected subcutaneously following the CLP operation.
Serum sample preparation. Blood samples were obtained from mice at 24 h post-CLP operation. In all instances, 1.5 ml of blood was introduced into a tube not containing any anticoagulant substance. The collected blood was left on the laboratory bench for 30 min at room temperature prior to centrifugation (3,000 × g, 5 min, 4°C); the serum was then carefully aspirated, transferred to a clean tube and stored at -80°C as soon as possible. The plate was incubated for 1 h at 37°C and washed five times. HRP-conjugated goat anti-mouse IgG (cat. no. 31430; 1:100; Invitrogen; Thermo Fisher Scientific, Inc., Waltham, MA, USA) was then added and incubated for 15 min at 37°C in the dark. The enzyme reaction was then stopped with 50 µl stop solution and absorbance was measured at 450 nm. Concentrations of inflammatory factors were calculated using a standard curve generated from standard solutions at different concentrations. Serum levels of inflammatory factors are presented as pg/ml.
Histological analysis. Colon tissues were collected from mice at 24 h post-CLP operation and fixed in 10% formalin for 24 h. The fixed colon tissues were embedded in paraffin and sectioned at 4 µm thickness. The sections were stained with hematoxylin-eosin (H&E) for histological observation to examine the pathology of colon injury induced by CLP. Immunohistochemical staining for Ki67 and occludin was performed to measure proliferation and tight junction formation between intestinal epithelial cells. For immunohistochemical staining, the standard protocol was followed. Briefly, sections were incubated with 3% H₂O₂ for 5 min at room temperature to eliminate the endogenous peroxidase activity, and then blocked with 5% bovine serum albumin for 1 h at 37°C and then incubated with rabbit anti-Ki67 polyclonal antibody (cat. no. PA5-19462; 1:100; Invitrogen; Thermo Fisher Scientific, Inc.) or rabbit anti-occludin polyclonal antibody (cat. no. 404700; 1:100; Invitrogen; Thermo Fisher Scientific, Inc.) at 4°C overnight. This was followed by washing three times with PBS for 5 min each. The sections were incubated with horseradish peroxidase-conjugated mouse-anti-rabbit IgG secondary antibody (cat. no. 31464; 1:100; Invitrogen; Thermo Fisher Scientific, Inc.) for 1 h at 37°C. Colour was developed with a 3,3'-diaminobenzidine kit (Nanjing KeyGen Biotech Co., Ltd., Nanjing, China) and microscopically examined with a light microscope (Motic BA210; Motic Instruments, Richmond, BC, Canada) using the following calculation: Ki67-positive cells/100 crypts or Occludin-positive cells, in five random ×100 magnified fields, five mice per group, one section per mouse. Terminal deoxynucleotidyl-transferase-mediated dUTP-biotin nick end-labeling (TUNEL) assay (Invitrogen; Thermo Fisher Scientific, Inc.) was used for evaluation of apoptosis in the CLP-treated colon, according to the manufacturer's protocol (TUNEL-positive cells/100 crypts, in five random ×100 magnified fields, five mice per group, one section per mouse).
DNA extraction and polymerase chain reaction (PCR) amplification. To ensure the abundance of LGG (normal/saline, n=5; normal/LGG, n=5), DNA was extracted from fecal samples obtained from a separate set of mice prior to the CLP operation. Quantification of LGG was performed by PCR using the following primers: LactoF, 5'-AGC AGT AGG GAA TCT TCC A; and LactoR, 5'-ATTYCACCGCTACACATG. The PCR reaction was performed as follows: 95°C for 5 min, followed by 35 cycles at 94°C for 15 sec, 53°C for 30 sec, and 72°C for 45 sec, and a final extension at 72°C for 10 min. To analyze the microbial diversity in mice with sepsis, fecal samples were collected from mice at 24 h post-CLP operation and frozen at -80°C immediately. Microbial DNA was extracted from fecal samples using a Qiagen mini kit (Qiagen GmbH, Hilden, Germany) according to the manufacturer's instructions. The V3-V4 region of the bacterial 16S ribosomal RNA gene was PCR-amplified using primers 341F (5'-CCT AYG GGR BGC ASC AG-3') and 806R (5'-GGA CTA CNN GGG TAT CTA AT-3'). An eight-bp unique barcode sequence was attached to each sample. A total of 50 µl PCR reactions contained 0.5 µl of dNTPs (10 mmol/l), 0.5 µl of each primer (5 µmol/l), 0.5 µl Fast-Pfu polymerase (Beijing TransGen Biotech Co., Ltd., Beijing, China), 5 µl 10X Fast-Pfu buffer, and 10 ng template DNA. The PCR reaction was performed as follows: 98°C for 30 sec, followed by 15 cycles at 98°C for 10 sec, 65°C for 30 sec, and 72°C for 30 sec, and a final extension at 72°C for 5 min. PCR products were pooled, purified using the AxyPrep DNA gel extraction kit (Axygen Biosciences, Union City, CA, USA) according to the manufacturer's protocol, and quantified using a fluorometric kit (Quant-iT PicoGreen; Invitrogen; Thermo Fisher Scientific, Inc.). Purified products were pooled in equimolar amounts for library construction. An Illumina MiSeq platform (Illumina, Inc., San Diego, CA, USA) was used for paired-end sequencing (2×250 bp), with sequencing libraries constructed according to the standard protocols.
Sequence analysis. Based on the barcodes and PCR primers, all raw reads were screened using the Quantitative Insights Into Microbial Ecology software (QIIME, version 1.17). The 250 bp sequences were trimmed at the site where a quality score <20 over a 10 bp sliding window was received, and sequences were omitted according to the following criteria: 1) sequences containing ambiguous bases; 2) sequences with more than two nucleotide mismatches in primer matching; 3) sequences shorter than 150 bp. UCHIME was used to identify and remove chimeras. USEARCH was applied to generate operational taxonomic units (OTUs), and the reads with the maximum length in each OTU were selected as representative sequences. Based on the bacterial SILVA dataset, representative sequences were assigned to different taxonomic levels. The richness and diversity of the gut bacteria were evaluated using the Chao1 (richness) and Shannon and Simpson (diversity) indexes, respectively. Principal coordinates analysis (PCoA) was conducted on the basis of weighted and unweighted UniFrac distance metrics to assess the interactions among bacterial communities of different samples. The Ribosomal Database Project classifier (http://rdp.cme.msu.edu/) was applied to annotate the sequences.
Statistical analysis. The homogeneity of variances was verified using Bartlett's test. One-way analysis of variance (with Bonferroni as a post hoc test) was performed to compare the bacteremia, inflammatory factors and histological study variables among different groups. Survival studies were conducted via the log-rank test. The two-tailed non-parametric Kruskal-Wallis test was used to compare the differences in diversity indexes and microbial taxa. The statistical analysis was conducted using GraphPad Prism 5 (GraphPad Software, Inc., La Jolla, CA, USA). P<0.05 was considered to indicate a statistically significant difference.
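As a minimal, illustrative re-implementation of the diversity and group-comparison steps described above (the study itself used QIIME 1.17 and GraphPad Prism 5), the Chao1, Shannon and Simpson indexes and a Kruskal-Wallis comparison can be computed as follows; the OTU count vectors are toy values, not the study data:

# Illustrative sketch only: toy OTU counts, not the sequencing data from this study.
import numpy as np
from scipy import stats

def shannon(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))            # Shannon diversity index

def simpson(counts):
    p = np.asarray(counts, dtype=float) / np.sum(counts)
    return 1.0 - np.sum(p ** 2)               # Simpson diversity index (1 - D)

def chao1(counts):
    counts = np.asarray(counts)
    s_obs = np.count_nonzero(counts)           # observed OTUs
    f1 = np.sum(counts == 1)                   # singletons
    f2 = np.sum(counts == 2)                   # doubletons
    return s_obs + (f1 * (f1 - 1)) / (2 * (f2 + 1))   # bias-corrected Chao1 richness

# Toy per-sample OTU count vectors for the three groups.
sham       = [[30, 25, 20, 10, 5, 1, 1], [28, 22, 18, 12, 6, 2, 1]]
clp_saline = [[60, 30, 5, 1, 0, 0, 0], [55, 35, 8, 2, 0, 0, 0]]
clp_lgg    = [[40, 25, 15, 8, 4, 1, 1], [38, 27, 14, 9, 3, 2, 1]]

shannon_by_group = [[shannon(s) for s in g] for g in (sham, clp_saline, clp_lgg)]
h_stat, p_value = stats.kruskal(*shannon_by_group)    # non-parametric comparison across groups
print(shannon_by_group, chao1(sham[0]), simpson(sham[0]), h_stat, p_value)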
Results LGG pretreatment reduces mortalit y and systemic inflammatory response in septic peritonitis. dNA isolated from the colon was analyzed by PcR to quantify the abundance of LGG. There was a significant increase in LGG in normal/LGG mice (9.24±0.06, log qPcR copy/fecal (g); n=5) compared with normal/saline mice (8.16±0.11, log qPcR copy/fecal (g); n=5; Table I). These data indicated that LGG was able to survive and propagate in the gastrointestinal tract following oral administration for 4 weeks. A previous study demonstrated that septic mice pretreated with LGG had a markedly improved 7 day survival rate compared with the cLP+saline mice (P=0.029). Five sham mice survived. To evaluate the effect of LGG on reducing the inflammatory response caused by sepsis, the levels of proinflammatory cytokines IL-6, TNF-, IL-2, and of the anti-inflammatory cytokine IL-22, were detected by ELISA assay in the serum of the mice. Serum levels of IL-22, IL-2, TNF- and IL-6 were the lowest in sham mice (Fig. 1). IL-22 and IL-2 levels were the highest in cLP+saline mice (Fig. 1). In the LGG-pretreated mice, IL-22 and IL-2 levels remained elevated in the septic mice compared with the sham mice (P<0.05; Fig. 1), but were decreased compared with the cLP+saline mice (P<0.05; Fig. 1). TNF- and IL-6 levels were higher in septic mice compared with sham mice (P<0.001; Fig. 1), regardless of LGG pretreatment or not. Quantification data from PCR analysis are expressed as log qPCR copy/fecal (g). data are expressed as mean ± standard deviation. LGG, Lactobacillus rhamnosus GG; PcR, polymerase chain reaction. LGG pretreatment alleviates injury in colon mucosa. As evaluated by H&E staining and histopathological analysis, colon tissues in sham mice were histologically normal in all layers ( Fig. 2A). In cLP+saline mice, the epithelial structure was almost entirely corrupted and disintegrated, accompanied with gland deformation and infiltration of inflammatory cells ( Fig. 2A). However, with LGG pretreatment, the epithelial appearance was regular, with less inflammatory cell infiltration in septic mice ( Fig. 2A). Occludin staining revealed a significant decrease in tight junction formation in the colonic tissue of both cLP+saline and cLP+LGG mice, compared with the sham group (Fig. 2B). However, the level of Occludin was much higher in cLP+LGG mice compared with the cLP+saline mice (P<0.001; Fig. 2B and E). colon mucosal injury was obvious in septic mice. The damage in septic mice pretreated with LGG was less obvious compared with the CLP+saline mice. Significant differences existed in mucosal morphology and tight junction formation among cLP+LGG mice, cLP+saline mice and sham mice. LGG pretreatment normalizes cell proliferation and apoptosis in colon. In the present study, colon mucosal injury was obvious in septic mice, but the injury was alleviated by LGG pretreatment. Therefore, cell proliferation marker and apoptosis of the colonic epithelium were examined next, by Ki67 staining (Fig. 2c) and TUNEL assay (Fig. 2d), Fig. 2E). These results suggested that LGG pretreatment promoted the proliferation and decreased the apoptosis of colon tissue in septic mice. LGG pretreatment increases microbial diversity in mice with sepsis. The coverage index was >97% per sample, which means the 16SrRNA sequences detected in the present study were sufficient to represent the majority of the bacteria present in the samples. 
The OTUs, richness index (Chao1) and diversity indices (Shannon and Simpson) were decreased in control septic mice compared with sham mice (Table II). With LGG pretreatment, all four indices increased in septic mice (Table II). Although no significant difference was observed in the four indices, the richness and diversity of bacteria in the CLP+LGG mice was much closer to that of sham mice compared with CLP+saline mice (Table II). The results indicated that sepsis decreased microbial richness and diversity, and that this trend could be reversed by LGG treatment.

Principal coordinates analysis among three groups. The weighted UniFrac PCoA plot was used to compare the similarities of microbiota structure among the three groups. The PCoA plot revealed a significantly separate clustering in microbiota structure among the three groups. The distance between CLP+LGG and sham mice was closer compared with the distance between CLP+saline and sham mice (PC1 and PC2 were 0.242 and 0.107, respectively; Fig. 3).

Bacterial taxonomic differences between septic mice with different treatments. The microbial profiles of the experimental mouse groups represented 10 phyla, among which Bacteroidetes, Firmicutes and Verrucomicrobia were the dominant phyla in sham mice, while Bacteroidetes, Proteobacteria and Firmicutes composed the majority of phyla in both control CLP+saline and CLP+LGG mice (Fig. 4A and B). Compared with sham mice, the level of Firmicutes was lower, Proteobacteria (P<0.01) and Deferribacteres (P<0.05) were higher, and the ratio of Bacteroidetes/Firmicutes (B/F) increased in CLP+saline mice. With LGG pretreatment, both the bacterial composition and the B/F ratio in septic mice reversed to the level of sham mice (P>0.05; Fig. 4C and D). The majority of OTUs detected among the three groups of mouse microbiomes were assigned to 28 taxa; with LGG pretreatment, no significant difference was observed between septic and sham mice in most of the bacteria (Fig. 4G and H).

Linear discriminant analysis (LDA) effect size (LEfSe) reveals the microbiota structure of the three groups and their predominant bacteria. LEfSe was used to compare the microbiota phylotypes among the three groups. The sham microbiome had a huge preponderance of Verrucomicrobia, Akkermansia, S24-7, F16 and TM_7, whereas the CLP+saline microbiome had a preponderance of Bacteroidetes, Enterobacteriales, Enterococcaceae, Deferribacteres, Pseudomonadales and Erysipelotrichi (Fig. 5). Lactobacillales, Pseudomonadales and Erysipelotrichi were enriched in CLP+LGG mice (Fig. 6). Compared with control septic mice, Deferribacteres disappeared, and Verrucomicrobia and Akkermansia appeared in septic mice treated with LGG (Fig. 7). These results suggested that sepsis-induced microbiota dysbiosis can be reversed by LGG pretreatment.

Discussion

Critical illness and its treatment create a hostile environment in the gastrointestinal tract by altering the microbiota. An altered mucosal oxygen gradient and increased nitrate production caused by critical illness favor the growth and invasion of opportunistic pathogens, such as Pseudomonas and Escherichia coli in the Proteobacteria phylum, and Staphylococcus and Enterococcus in the Firmicutes phylum, resulting in the release of cytokines, cell apoptosis and corruption of epithelial tight junctions. With loss of intestinal barrier function, the gut is unable to prevent the translocation of pathogens and toxins into the blood and extraintestinal organs, leading to or aggravating sepsis and mortality.
The present study demonstrated that, after the onset of sepsis, there was an appearance of the opportunistic pathogens Staphylococcaceae and Enterococcaceae, and a disappearance of beneficial Prevotellaceae. High relative abundance of potentially pathogenic commensals, such as Enterobacteriaceae, Bacteroidaceae, Erysipelotrichaceae, Deferribacteraceae, Clostridiaceae and Pseudomonadaceae, was associated with more severe immune responses during sepsis, demonstrated by higher serum levels of proinflammatory cytokines (IL-22, IL-2, TNF-α and IL-6), epithelial cell apoptosis and disruption of tight junctions. With LGG pretreatment, opportunistic pathogens decreased or even disappeared, while beneficial bacteria, such as Verrucomicrobiaceae, increased, epithelial cell apoptosis was inhibited, and proliferation and cell tight junction formation were promoted. The present results suggest that prophylactic LGG therapy could be effective in reducing mortality from sepsis via the normalization of altered gut flora, inhibition of systemic inflammation and maintenance of the mucosal barrier function.

Bacteroidetes and Firmicutes are the two dominant phyla in the human and mouse microbiome. Changes in the relative abundance of Bacteroidetes and Firmicutes have been determined to affect energy balance. Firmicutes is related to energy harvest and storage, while Bacteroidetes has a capacity to consume energy. The present results revealed a higher B/F ratio in CLP+saline mice compared with sham mice. With LGG treatment, the B/F ratio decreased. One potential explanation for this may be that the high degree of systemic inflammation caused by sepsis is a highly energy-consuming process, and therefore, during this process, both energy harvest and storage decreased. As a result, Firmicutes decreased and Bacteroidetes increased. Prophylactic LGG therapy partly reversed the B/F ratio and therefore rebalanced the energy intake and expenditure, and reduced mortality in sepsis.

Vollaard et al reported the concept of colonization resistance (CR). In this concept, the anaerobic microbiome limits the concentration of opportunistic pathogenic (mostly aerobic) bacteria in the gastrointestinal tract. Analysis of the fecal microbiome revealed that total anaerobic bacteria counts decreased notably in patients with severe systemic inflammatory response syndrome. Research linking gut translocation of bacteria to the development of postoperative sepsis indicated that Gram-negative facultative anaerobic bacteria were the dominant microflora in patients with postoperative sepsis, whereas obligate anaerobe counts decreased, suggesting that the imbalance between obligate anaerobes and facultative anaerobes is closely related to the incidence of infectious complications in critically ill patients.

(Figure legend) LEfSe and LDA analysis based on OTUs characterize the microbiomes of the CLP+saline mice and CLP+LGG mice. Cladogram using the LEfSe method indicating the phylogenetic distribution of fecal microbes associated with CLP+LGG (green) and CLP+saline mice (red). LDA scores showed the significant bacterial differences between the CLP+LGG and CLP+saline mice. CLP+saline mice, n=8; CLP+LGG mice, n=8. LEfSe, linear discriminant analysis effect size; LDA, linear discriminant analysis; OTU, operational taxonomic unit; CLP, cecal ligation and puncture; LGG, Lactobacillus rhamnosus GG.
In the present study, facultative anaerobic bacteria, including Enterobacteriaceae, Bacteroidaceae and Erysipelotrichaceae, increased significantly and obligate anaerobes, such as Prevotellaceae, disappeared in control septic mice. Prophylactic LGG therapy rebalanced the ratio of obligate anaerobes and facultative anaerobes, and therefore enhanced the capacity for colonization resistance in the host.

Evidence suggests that several specific members of the microbiota serve important roles in rebalancing dysbiosis and preventing disease. For example, Coprococcus produces butyrate to provide an energy source for epithelial cells and induces the differentiation of colonic regulatory T cells to suppress inflammatory and allergic responses. Akkermansia has been described as having an anti-inflammatory property. It has been reported that Akkermansia administration could raise the levels of endocannabinoids in intestinal cells that control gut peptide secretion, the gut barrier and inflammation. In addition, Akkermansia, Coprococcus, Lactobacillus and Oscillibacter have been correlated negatively with lipopolysaccharide (LPS) in feces and with hepatic function features (including plasma glucose and total lipids), whereas the LPS-producer Bacteroides was positively correlated with fecal LPS and hepatic function in mice with liver diseases. In the present study, Akkermansia, Coprococcus, Sutterella, Oscillospira, Lactobacillus and Desulfovibrio were decreased and Bacteroides was increased in control septic mice compared with sham mice at the genus level. With LGG treatment, no significant difference was observed between septic and sham mice, indicating that LGG may increase gut barrier function, decrease the levels of LPS, reduce inflammation and improve hepatic function through increasing Akkermansia, Coprococcus, Lactobacillus and Oscillibacter and decreasing Bacteroides.

In summary, the present study demonstrated that prophylactic LGG therapy reduced mortality through attenuating inflammatory responses, and increasing gut barrier integrity and function. LGG pretreatment increased the diversity of the intestinal microbiota and the abundance of beneficial bacteria. Among them, Prevotellaceae, Lactobacillaceae, Staphylococcaceae, Enterococcaceae, Enterobacteriaceae, Bacteroidaceae and Deferribacteraceae can be regarded as the key bacteria in sepsis treatment. Prevotellaceae may be exploited as a potential probiotic in sepsis treatment.
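The phylum-level composition and Bacteroidetes/Firmicutes (B/F) ratio discussed above reduce to simple proportions over a count table. The following minimal sketch illustrates that calculation on a hypothetical phylum-level count table for a single sample; it is not the pipeline used in the study.

# Hypothetical phylum-level read counts for one fecal sample
counts = {
    'Bacteroidetes': 5200,
    'Firmicutes': 2600,
    'Proteobacteria': 1400,
    'Verrucomicrobia': 300,
    'Deferribacteres': 120,
}

total = sum(counts.values())
relative_abundance = {phylum: n / total for phylum, n in counts.items()}
bf_ratio = counts['Bacteroidetes'] / counts['Firmicutes']

for phylum, fraction in sorted(relative_abundance.items(), key=lambda kv: -kv[1]):
    print(f"{phylum}: {fraction:.1%}")
print(f"B/F ratio: {bf_ratio:.2f}")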
Thinking back to my opinion of natural hair back in 2005, many women have had the same or very similar “back to natural” hair journey as I. See, I’ve always liked the look and texture of coily, kinky, fuzzy hair as far back as I can remember. However, like most Black women growing up when I did, I didn’t like that natural look on my head. I can only deduce that for me, it was a conditioning of my mind born out of the quiet, unspoken desperation of the black women living around me. Straight hair was the inevitable initiation into the world of maintaining or hiding my shame. Permanently straight or wavy hair was also necessary to save my mother’s sanity. She had to keep up with hairstyling on the heads of three girls as well as her own hair care.
So, my decision to stop relaxing and go natural wasn’t because I had suddenly discovered that I like natural hair and wanted to free myself of relaxers. I had already known natural hair was an option and that it could look nice. The reason I decided to stop relaxing my hair was that my hair density was thinning out. My scalp was showing in places and no joy from a “fresh perm” could cover for the despair I felt watching my thick head of hair dwindle over the years.
Knowing it was from years of lye burns, brushing, and pregnancy, I decided to stop relaxing my hair in December of 2005. It was the first step in my solution to reduce or stop the progression of my alopecia. My appreciation for natural growing hair helped me in the coming months to endure the endless criticisms and questions from family and coworkers. Plus, I couldn’t go back to straightening with relaxers because I needed the kinks of my hair to fill in my scalp!
In the beginning I didn’t know much about hair health or styling most of the time, so many days I went to work with my hair looking unkempt. Thankfully, my career didn’t suffer because of it. In fact, the raises and merit pay kept coming and my confidence was rising. I soon learned more information about hair care from the internet forums and videos. My journey has benefited my daughters, also. They love their “puffy” hair and their hair is healthy and growing. My family and friends have all pretty much accepted my natural hair. I can count seven of my close family members that have started natural journeys of their own citing my perseverance and confidence as inspiration.
I’ve certainly made tougher and more serious decisions in my life than wearing my hair natural. However, going natural is among the choices I’ve made that have brought about so many more positive changes in my life.
Guest Author: Lisa Ingram
Lisa Ingram is a former company data analyst, now a freelance virtual assistant and stay-at-home mom to three wonderful children. Contact her at ingr96centurylink@outlook.com
Do you have a story to share about your journey, natural hair care strategies, or your thoughts about the societal aspects of being a black woman who chooses to wear her hair natural? Join Lisa and submit a guest post and we’ll feature you on our blog.
Economic Burden and Medical Insurance Impact of Different Dialysis Modalities for End-stage Renal Disease.

Background. Dialysis costs are a heavy burden for End Stage Renal Disease (ESRD) patients. In China, the two major medical insurance systems are the New Cooperative Medical Scheme (NCMS) for rural residents and the Urban Employees' Medical Insurance (UEMI) for urban patients. This study compared the economic burden of ESRD patients under different dialysis methods and the impact of the medical insurance system on it.

Methods. Overall, 156 ESRD patients were enrolled at the Department of Nephrology in the First Affiliated Hospital of Zhengzhou University, Zhengzhou, China between Jan 2013 and Jan 2014. They were divided into a hemodialysis group (HD group, n=84) and a peritoneal dialysis group (PD group, n=72). Data such as the patients' basic information, total expenses and self-paid expenses in the early stage of dialysis and over 1 year of treatment, and medical insurance type were collected separately.

Results. The early-stage average total expenses and self-paid expenses in the PD group were higher than those in the HD group (P<0.01). The 1-year average total expenses and self-paid expenses in the PD group were lower than those in the HD group (P<0.01). Whichever dialysis method was used, the self-paid expense percentage for the NCMS patients was higher than that for UEMI patients.

Conclusion. In terms of long-term dialysis treatment for ESRD patients, judging by treatment expenses, PD was the better choice. Meanwhile, the type of medical insurance had a significant impact on the economic burden of dialysis patients.

Introduction

End Stage Renal Disease (ESRD) is one of the serious diseases that threaten human health worldwide, and its incidence shows an increasing trend year by year. According to the renal disease data system in the US, the number of ESRD patients increases at a rate of 1.9%-2.3%. The prevalence rate of Chronic Kidney Disease (CKD) in Mainland China is up to 10.8%. Of the 120 million CKD patients, about 2% develop into ESRD patients. About 2-3 million people need to receive long-term renal replacement therapy, namely hemodialysis, peritoneal dialysis or renal transplantation, in order to ensure a basic quality of life. Specifically, hemodialysis and peritoneal dialysis are the major treatment methods for ESRD in China. However, with the rapid growth of the dialysis population in recent years, dialysis costs have kept increasing in various countries, and ESRD patients have become a heavy burden on society and families. With the constant improvement of the medical insurance system in China, the economic burden of dialysis patients has been relieved to varying degrees. However, which dialysis method has better cost benefits and can reduce the payment pressure on the medical insurance fund while relieving the burden on patients has become a pressing and difficult issue for professionals and medical insurance authorities in China.
This study enrolled the regular peritoneal dialysis and hemodialysis patients of the First Affiliated Hospital of Zhengzhou University, collected data such as the patients' medical insurance type, the differences in total expenses in the early stage of dialysis, the 1-year total treatment expenses and the self-paid expenses, compared and analyzed the differences in economic burden between the dialysis methods and the compensation amounts under different medical insurance policies, and provided an empirical basis for relieving the economic burden of dialysis treatment on ESRD patients, for choosing the dialysis method and for perfecting the medical insurance policy.

Materials and Methods

The ESRD patients who received peritoneal dialysis and hemodialysis at the First Affiliated Hospital of Zhengzhou University for the first time in Henan Province from Jan 2013 to Jan 2014 were chosen. They were screened one by one against the inclusion and exclusion criteria, and 156 patients were enrolled: 72 patients in the peritoneal dialysis group and 84 patients in the hemodialysis group, including 91 males and 65 females. The patients who met the inclusion and exclusion conditions were contacted by phone and followed up face to face, were informed of the study aims and the requirements to cooperate with the follow-up, and their opinions were sought. A questionnaire survey was conducted to collect data. This study was approved by the Ethical Committee of the First Affiliated Hospital of Zhengzhou University and all the participants provided written informed consent.

The inclusion criteria were as follows: 1) aged 18-75; 2) compliance with the ESRD diagnosis according to the 2012 KDIGO guideline; 3) the PD or HD access must have been established at our hospital for the first time; 4) regular dialysis (namely HD 3 times/week, or PD as CAPD) for at least 1 year, with regular follow-up; 5) a clear mind, normal memory, no serious complications concerning the heart, lungs, brain or liver, etc., and agreement to cooperate in this survey. The exclusion criteria were as follows: 1) aged <18 or >75; 2) dialysis time <1 year or irregular dialysis; 3) mental disorders, disturbance of consciousness, communication difficulties or serious complications preventing cooperation in the survey.

The collected data on general conditions included the name, gender, age, occupation, medical insurance type, dialysis method, total cost of dialysis in the early stage, and total cost of 1-year treatment. The pre-dialysis costs refer to the costs incurred in the preparatory stage of dialysis, which mainly included the diagnosis fee, treatment fee, medicine fees and general medical service fee. The 1-year total costs mainly include the dialysis treatment cost, complication drug cost and re-examination cost. The medical insurance types include the New Cooperative Medical Scheme (NCMS) for rural residents and the Urban Employees' Medical Insurance (UEMI) for urban patients. For data collection, the retrospective analysis method was used. Overall, 156 patients who received PD and HD at the First Affiliated Hospital of Zhengzhou University for the first time in Henan Province from Jan 2013 to Jan 2014 were included: 72 patients in the PD group and 84 patients in the HD group.
All data were entered in duplicate by two persons using Excel 2007 to establish the database for data management. For statistical analysis, SPSS 19.0 software (Chicago, IL, USA) was used. Measurement data were expressed as mean ± standard deviation. Non-normally distributed data were transformed via logarithmic transformation to approximate a normal distribution before analysis. Comparisons between the two groups employed the t-test. Differences were considered statistically significant at P<0.05.

Medical insurance type for dialysis patients. Of the subjects of this study, 89 people (accounting for 57.05%) participated in the NCMS, and 24.36% were covered by the UEMI (Table 2).

Costs of different dialysis methods. Total costs before dialysis: the statistical results of pre-dialysis costs showed that the total cost of the PD group was about $5,810, and the total cost of the HD group was about $3,470. The total cost of the PD group before dialysis was higher than that of the HD group (P<0.01 in both cases), which was statistically significant (Table 3). Total cost for 1-year dialysis: the statistical results of 1-year dialysis showed that the total cost of the PD group was $14,560, and the total cost of the HD group was $18,250. The total cost of the PD group was lower than that of the HD group (P<0.01 in both cases), which was statistically significant (Table 4).

Patient expenses under different medical insurance policies. For patients in the PD group, comparison of expenses under different medical insurance policies showed that for the NCMS patients, the early-stage total expenses and self-paid expenses were $5,990 and $3,520 respectively, the 1-year total expenses were $13,580, and the self-paid expenses were $5,330, all lower than those of UEMI patients; however, the self-paid expense percentage for the NCMS patients was higher than for UEMI patients (Table 5). For patients in the HD group, in terms of total expenses, the NCMS patients had lower expense levels than UEMI patients for both early-stage total expenses and 1-year total expenses. However, judging by the self-paid expenses, the self-paid expense percentage for the NCMS patients was higher than that of UEMI patients (Table 6).

Discussion

Hemodialysis and peritoneal dialysis are the commonly used kidney replacement methods for ESRD patients. Hemodialysis can effectively clear metabolites and excess fluid from the body, maintain electrolyte and acid-base balance, and is the first choice for most ESRD patients. Peritoneal dialysis can protect residual renal function, is portable, and has been widely used in recent years. Patients using PD have better perceived function and quality of life than patients using HD. In Hong Kong, China, ESRD patients tend to choose PD, accounting for 72.9% of the total dialysis population. However, whichever dialysis method is used requires massive manpower and material resources and causes a heavy burden on both country and individual. ESRD patients are mainly aged 30-60 and are the major labor force of their families and society, as well as important factors that affect social and economic development (Table 1). Concerning the dialysis cost, the cost of the PD group patients before dialysis was higher than that of the HD group patients by about $2,260 (Table 3).
The major possible causes include: 1) Different anesthesia methods: the peritoneal dialysis operation is performed under general anesthesia, while hemodialysis access establishment is a local anesthesia operation. The cost of a general anesthesia operation is about $205, while the cost of local anesthesia is only $13. 2) Different operation costs: the cost of the traditional laparotomy PD operation is about $161, the cost of laparoscopic PD catheterization is about $291, the cost of establishing a long-term HD catheter is $969, and the cost of arteriovenous fistula plasty is $242, which shows that the PD operation cost is less than that of HD (the above costs are based on the charging standard of the First Affiliated Hospital of Zhengzhou University). 3) Different preoperative preparations: PD has high requirements for a patient's condition, and the cost of preoperative preparations is high. 4) Different lengths of hospitalization: the PD group patients need to receive formal training seven days after catheter placement, whereas the HD group patients may be discharged from hospital after 1 day of observation following access establishment. Therefore, the PD group patients are hospitalized longer than the HD group patients and incur higher costs.

Table 4 shows that, in terms of the total treatment costs for the patients of the two groups 1 year later, the PD group cost about $14,560 and the HD group about $18,250, which is equivalent to the costs shown by the national survey. The possible causes may include: 1) Different dialysis costs: at municipal-level hospitals, the cost of each hemodialysis session is about $57-80, while for peritoneal dialysis patients, the cost of each bag of dialysate is about $6. In the case of regular and full dialysis, the cost for HD patients is certainly higher than that for PD patients. 2) Different numbers of hospital visits: PD may be conducted at home after catheter placement, while HD must be conducted at the hospital 3 times every week on average. Therefore, PD may save transport and hospitalization costs, etc. 3) Drugs: for HD patients, the drainage of blood out of the body may result in unstable hemodynamics and chronic blood loss, but for PD this situation does not occur. Therefore, PD patients use fewer drugs such as antihypertensive drugs and erythropoietin. Consequently, judging by the 1-year treatment cost, the PD group families have a lighter economic burden than the HD group families.

Tables 5 and 6 show that, judging by expense levels, the total dialysis expenses and the self-paid expenses for the NCMS patients were lower than those of UEMI patients, but the self-paid expense percentage for the NCMS patients was higher. This is probably because the basic income of rural residents in Henan Province is far lower than that of urban residents; some patients might stop using auxiliary drugs, such as drugs for anemia correction, electrolyte disturbance correction and blood pressure control, or buy cheaper drugs, or fail to follow the doctor's advice and pay return visits, which made the expenses of such patients low. However, high medical expenses still resulted in a large number of "illness-related poverty" phenomena. Judging by the reimbursement proportion, the self-paid expense percentage for the NCMS patients was about 50%-60% and was higher than that of other medical insurance patients.
The reasons may include: 1) The reimbursement proportion for the NCMS patients as defined by the national medical insurance policy is obviously lower than that defined by the provincial, municipal and employees' medical insurance, which is unfair to some extent. 2) Restriction of the hierarchical diagnosis and treatment policy: as ESRD patients have many serious complications, some of them need to be transferred to provincial hospitals for treatment, but the hierarchical diagnosis and treatment system stipulates that skip-level treatment will decrease a patient's reimbursement proportion.

Conclusion

Although the PD cost in the preparatory stage before dialysis is higher than the HD cost for ESRD patients, the 1-year treatment cost is obviously lower in the PD group than in the HD group. By comparison, the PD patients have a lighter family financial burden than the HD patients. Medical insurance organizations should increase the reimbursement proportion and formulate reimbursement policies that encourage patients to voluntarily choose PD, so as to further ease patients' economic burden and improve ESRD patients' quality of life. Meanwhile, further cost-effectiveness analyses of peritoneal dialysis and hemodialysis should be carried out to provide decision-makers with a stronger basis for decisions and suggestions.

Ethical considerations

Ethical issues (including plagiarism, informed consent, misconduct, data fabrication and/or falsification, double publication and/or submission, redundancy, etc.) have been completely observed by the authors.
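As a rough cross-check of the figures discussed above, the per-session prices quoted in the Discussion can be turned into approximate annual dialysis-only costs. The HD price range ($57-80 per session, three sessions per week) and the PD dialysate price (about $6 per bag) come from the text; the four CAPD exchanges per day assumed below is an illustrative assumption, not a figure reported in the study.

# Approximate annual dialysis-only costs from the per-session prices quoted in the text
hd_sessions_per_year = 3 * 52                     # three hospital sessions per week
hd_low = 57 * hd_sessions_per_year
hd_high = 80 * hd_sessions_per_year

pd_bags_per_day = 4                               # assumed CAPD exchanges per day (illustrative)
pd_annual = 6 * pd_bags_per_day * 365             # about $6 per bag of dialysate

print(f"HD dialysis-only cost per year: ${hd_low:,} - ${hd_high:,}")
print(f"PD dialysate-only cost per year: ${pd_annual:,}")
# These rough dialysis-only figures ignore transport, hospitalization and drug costs,
# which the Discussion identifies as further drivers of the higher 1-year HD total.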
// Size returns the total size of an image's packed resources.
func (image *Image) Size(ctx context.Context, provider content.Provider) (int64, error) {
	var size int64
	return size, Walk(ctx, Handlers(HandlerFunc(func(ctx context.Context, desc ocispec.Descriptor) ([]ocispec.Descriptor, error) {
		// Guard against descriptors advertising a negative size, which would corrupt the sum.
		if desc.Size < 0 {
			return nil, errors.Errorf("invalid size %v in %v (%v)", desc.Size, desc.Digest, desc.MediaType)
		}
		// Accumulate the size of every descriptor visited while walking the image graph.
		size += desc.Size
		return nil, nil
	}), ChildrenHandler(provider)), image.Target)
}
/**
* Adds a new line to the VCFHeader. If there is an existing header line of the
* same type with the same key, the new line is not added and the existing line
* is preserved.
*
* @param headerLine header line to attempt to add
*/
public void addMetaDataLine(final VCFHeaderLine headerLine) {
    if ( addMetadataLineLookupEntry(headerLine) ) {
        mMetaData.add(headerLine);
        checkForDeprecatedGenotypeLikelihoodsKey();
    }
}
Walmart-backed Flipkart is expanding the furniture category on its platform with the introduction of new sub-brand ‘Pure Wood’ as it looks to compete aggressively against not just its arch-rival Amazon but also IKEA in the Indian market. Flipkart has partnered with solid wood furniture makers in cities like Jaipur and Jodhpur in Rajasthan for Pure Wood, which would be under its private label ‘Perfect Homes’. The collection named Amer, Mehrangarh, Nahargarh, Taragarh and Jaisalmer will be priced between Rs 5,000-70,000.
E-commerce companies focus on private labels because they offer higher margins and enable better control of inventory. “If you see the furniture market in India, it is about USD 15 billion in size. And yet, 90 per cent of it is unorganised. Of the 10 per cent that is organised, online players take up only 10-15 per cent, so there is a huge scope of growth,” Flipkart Senior Director (Private Labels) Shivani Suri said.
She added that estimates (internal and industry) suggest that online channels will account for 25-30 per cent of the organised furniture market by 2020. While she declined to comment on revenue targets, Suri said Pure Wood and Perfect Homes would contribute significantly to the topline from the furniture category.
“Furniture is a difficult category. It’s not just about offering the ‘touch and feel’ experience, customers are looking for quality, durability and affordability. Using consumer insights from our platform, we are getting top designs in quality products at affordable prices for the customer,” she said.
Asked about competition from Swedish giant IKEA, which recently launched its store in Hyderabad, Suri pointed out that the opportunity in the Indian market is huge. “I don’t want to comment on competition, I’m sure they have their own strategy in place. We are focussed on bringing an expansive range to customers, quality products that are affordable, accessible across India with a great service promise,” she said.
IKEA set up its first store in India last month and has plans to open 25 stores by 2025. It is also looking to enter the e-commerce segment by next year, besides exploring small format stores as part of its expansion plans in the country. Within the online category, Flipkart competes with its American rival Amazon as well as players like Urban Ladder and Pepperfry.
The launch of the new range also comes ahead of Flipkart’s Big Billion Days sale that is slated for next month. Suri said furniture under Perfect Homes are available with FurniSure – a certification to assure customers of the quality and durability of the products. The certification, she claimed, is offered after a rigorous test process conducted through NABL-accredited testing laboratories, including Intratek, MTS, BV, and SGS.
def accuracy_measure(self, test: pd.DataFrame) -> float:
    # Percentage agreement between the ground-truth labels and the model predictions
    acc = accuracy_score(test['label'], test['pred']) * 100
    print("Test accuracy: {:.2f}%".format(acc))
    return acc
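A minimal, self-contained way to exercise this method is sketched below. The Evaluator wrapper class, the column names and the toy data are assumptions for illustration only, not part of the original code.

import pandas as pd
from sklearn.metrics import accuracy_score

class Evaluator:
    # Hypothetical host class, used only to call accuracy_measure
    def accuracy_measure(self, test: pd.DataFrame) -> float:
        acc = accuracy_score(test['label'], test['pred']) * 100
        print("Test accuracy: {:.2f}%".format(acc))
        return acc

# Toy frame with ground-truth labels and model predictions
toy = pd.DataFrame({'label': [1, 0, 1, 1], 'pred': [1, 0, 0, 1]})
Evaluator().accuracy_measure(toy)   # prints "Test accuracy: 75.00%"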
/*
* (C) Copyright 2014
* <NAME>, Guntermann & Drunck GmbH, <EMAIL>
*
* SPDX-License-Identifier: GPL-2.0+
*/
#include <common.h>
#include <miiphy.h>
enum {
MIICMD_SET,
MIICMD_MODIFY,
MIICMD_VERIFY_VALUE,
MIICMD_WAIT_FOR_VALUE,
};
struct mii_setupcmd {
u8 token;
u8 reg;
u16 data;
u16 mask;
u32 timeout;
};
/*
* verify we are talking to a 88e1518
*/
struct mii_setupcmd verify_88e1518[] = {
{ MIICMD_SET, 22, 0x0000 },
{ MIICMD_VERIFY_VALUE, 2, 0x0141, 0xffff },
{ MIICMD_VERIFY_VALUE, 3, 0x0dd0, 0xfff0 },
};
/*
* workaround for erratum mentioned in 88E1518 release notes
*/
struct mii_setupcmd fixup_88e1518[] = {
{ MIICMD_SET, 22, 0x00ff },
{ MIICMD_SET, 17, 0x214b },
{ MIICMD_SET, 16, 0x2144 },
{ MIICMD_SET, 17, 0x0c28 },
{ MIICMD_SET, 16, 0x2146 },
{ MIICMD_SET, 17, 0xb233 },
{ MIICMD_SET, 16, 0x214d },
{ MIICMD_SET, 17, 0xcc0c },
{ MIICMD_SET, 16, 0x2159 },
{ MIICMD_SET, 22, 0x00fb },
{ MIICMD_SET, 7, 0xc00d },
{ MIICMD_SET, 22, 0x0000 },
};
/*
* default initialization:
* - set RGMII receive timing to "receive clock transition when data stable"
* - set RGMII transmit timing to "transmit clock internally delayed"
* - set RGMII output impedance target to 78,8 Ohm
* - run output impedance calibration
* - set autonegotiation advertise to 1000FD only
*/
struct mii_setupcmd default_88e1518[] = {
{ MIICMD_SET, 22, 0x0002 },
{ MIICMD_MODIFY, 21, 0x0030, 0x0030 },
{ MIICMD_MODIFY, 25, 0x0000, 0x0003 },
{ MIICMD_MODIFY, 24, 0x8000, 0x8000 },
{ MIICMD_WAIT_FOR_VALUE, 24, 0x4000, 0x4000, 2000 },
{ MIICMD_SET, 22, 0x0000 },
{ MIICMD_MODIFY, 4, 0x0000, 0x01e0 },
{ MIICMD_MODIFY, 9, 0x0200, 0x0300 },
};
/*
* turn off CLK125 for PHY daughterboard
*/
struct mii_setupcmd ch1fix_88e1518[] = {
{ MIICMD_SET, 22, 0x0002 },
{ MIICMD_MODIFY, 16, 0x0006, 0x0006 },
{ MIICMD_SET, 22, 0x0000 },
};
/*
* perform copper software reset
*/
struct mii_setupcmd swreset_88e1518[] = {
{ MIICMD_SET, 22, 0x0000 },
{ MIICMD_MODIFY, 0, 0x8000, 0x8000 },
{ MIICMD_WAIT_FOR_VALUE, 0, 0x0000, 0x8000, 2000 },
};
/*
* special one for 88E1514:
* Force SGMII to Copper mode
*/
struct mii_setupcmd mii_to_copper_88e1514[] = {
{ MIICMD_SET, 22, 0x0012 },
{ MIICMD_MODIFY, 20, 0x0001, 0x0007 },
{ MIICMD_MODIFY, 20, 0x8000, 0x8000 },
{ MIICMD_SET, 22, 0x0000 },
};
/*
* turn off SGMII auto-negotiation
*/
struct mii_setupcmd sgmii_autoneg_off_88e1518[] = {
{ MIICMD_SET, 22, 0x0001 },
{ MIICMD_MODIFY, 0, 0x0000, 0x1000 },
{ MIICMD_MODIFY, 0, 0x8000, 0x8000 },
{ MIICMD_SET, 22, 0x0000 },
};
/*
* invert LED2 polarity
*/
struct mii_setupcmd invert_led2_88e1514[] = {
{ MIICMD_SET, 22, 0x0003 },
{ MIICMD_MODIFY, 17, 0x0030, 0x0010 },
{ MIICMD_SET, 22, 0x0000 },
};
static int process_setupcmd(const char *bus, unsigned char addr,
struct mii_setupcmd *setupcmd)
{
int res;
u8 reg = setupcmd->reg;
u16 data = setupcmd->data;
u16 mask = setupcmd->mask;
u32 timeout = setupcmd->timeout;
u16 orig_data;
unsigned long start;
debug("mii %s:%u reg %2u ", bus, addr, reg);
switch (setupcmd->token) {
case MIICMD_MODIFY:
res = miiphy_read(bus, addr, reg, &orig_data);
if (res)
break;
debug("is %04x. (value %04x mask %04x) ", orig_data, data,
mask);
data = (orig_data & ~mask) | (data & mask);
/* fallthrough */
case MIICMD_SET:
debug("=> %04x\n", data);
res = miiphy_write(bus, addr, reg, data);
break;
case MIICMD_VERIFY_VALUE:
res = miiphy_read(bus, addr, reg, &orig_data);
if (res)
break;
if ((orig_data & mask) != (data & mask))
res = -1;
debug("(value %04x mask %04x) == %04x? %s\n", data, mask,
orig_data, res ? "FAIL" : "PASS");
break;
case MIICMD_WAIT_FOR_VALUE:
res = -1;
start = get_timer(0);
while ((res != 0) && (get_timer(start) < timeout)) {
res = miiphy_read(bus, addr, reg, &orig_data);
if (res)
continue;
if ((orig_data & mask) != (data & mask))
res = -1;
}
debug("(value %04x mask %04x) == %04x? %s after %lu ms\n", data,
mask, orig_data, res ? "FAIL" : "PASS",
get_timer(start));
break;
default:
res = -1;
break;
}
return res;
}
static int process_setup(const char *bus, unsigned char addr,
struct mii_setupcmd *setupcmd, unsigned int count)
{
int res = 0;
unsigned int k;
for (k = 0; k < count; ++k) {
res = process_setupcmd(bus, addr, &setupcmd[k]);
if (res) {
printf("mii cmd %u on bus %s addr %u failed, aborting setup\n",
setupcmd[k].token, bus, addr);
break;
}
}
return res;
}
int setup_88e1518(const char *bus, unsigned char addr)
{
int res;
res = process_setup(bus, addr,
verify_88e1518, ARRAY_SIZE(verify_88e1518));
if (res)
return res;
res = process_setup(bus, addr,
fixup_88e1518, ARRAY_SIZE(fixup_88e1518));
if (res)
return res;
res = process_setup(bus, addr,
default_88e1518, ARRAY_SIZE(default_88e1518));
if (res)
return res;
if (addr) {
res = process_setup(bus, addr,
ch1fix_88e1518, ARRAY_SIZE(ch1fix_88e1518));
if (res)
return res;
}
res = process_setup(bus, addr,
swreset_88e1518, ARRAY_SIZE(swreset_88e1518));
if (res)
return res;
return 0;
}
int setup_88e1514(const char *bus, unsigned char addr)
{
int res;
res = process_setup(bus, addr,
verify_88e1518, ARRAY_SIZE(verify_88e1518));
if (res)
return res;
res = process_setup(bus, addr,
fixup_88e1518, ARRAY_SIZE(fixup_88e1518));
if (res)
return res;
res = process_setup(bus, addr,
mii_to_copper_88e1514,
ARRAY_SIZE(mii_to_copper_88e1514));
if (res)
return res;
res = process_setup(bus, addr,
sgmii_autoneg_off_88e1518,
ARRAY_SIZE(sgmii_autoneg_off_88e1518));
if (res)
return res;
res = process_setup(bus, addr,
invert_led2_88e1514,
ARRAY_SIZE(invert_led2_88e1514));
if (res)
return res;
res = process_setup(bus, addr,
default_88e1518, ARRAY_SIZE(default_88e1518));
if (res)
return res;
if (addr) {
res = process_setup(bus, addr,
ch1fix_88e1518, ARRAY_SIZE(ch1fix_88e1518));
if (res)
return res;
}
res = process_setup(bus, addr,
swreset_88e1518, ARRAY_SIZE(swreset_88e1518));
if (res)
return res;
return 0;
}
This invention relates to a new and improved take-out carton and comprises a tray and a cover for the tray and blanks for forming the same. The blanks are planar sheets of paper-board which are folded and secured into a flanged tray and a tapered cover for locking engagement therewith. The blanks and resulting cover and tray may be made of plastic or materials other than paperboard, if desired. The primary purpose of the take-out container of the invention is for transport of hot food items such as those served by fast food establishments.
The carton of the invention is preferably of a rectangular shape. The tray of the carton of the invention has four flanges all of which are outwardly and downwardly directed and are hingedly connected by means of a common fold line to the top of an associated side or end wall of the tray. The fold lines connecting the flanges to their associated side panels create an upward and outward bias on the outwardly and downwardly directed tray flanges.
The cover for the tray is also of a rectangular shape and although it need not be dimensioned precisely to the tray, it is geometrically similar. The cover has two pairs of opposed tapered side walls or closure panels. The lower end of the pair of closure panels of greater length each have a folded back marginal portion secured to the inside thereof. The cover can then be pushed onto a tray such that the associated side flanges of the tray spring slightly upward and back into the indentation formed by the folded back marginal portion of the cover and the side closure panel thereby effectively fastening the cover to the tray.
To release the cover from the tray one grasps the center of the cover side closure panels or walls and pushes the cover down as he pivots and lifts the walls upwardly to allow the flanges of the tray to be released from their edge-to-edge engagement with the folded back marginal portions of the cover.
The end or closure panels of lesser dimension of the cover engage the end flanges of the tray to both aid in effective thermal sealing of the units and to provide a lifting area for the combined units since the cover cannot be released from the tray by lifting at these points. The flanges on the tray ends are of greater width than those along the tray side walls to provide a lifting surface attached directly to the tray itself so that the weight of the carton and its contents may be conveniently and safely supported thereby.
Various take-out cartons of the prior art have been utilized in which the cover is hingedly connected to the tray but this has not been entirely satisfactory since these covers have had a tendency to inadvertently flip down into the food during the user's meal. Furthermore, prior two-part take-out cartons have had the problem of insufficient locking and, therefore, there has been danger of a gust of wind removing the top, for example, when the carton is being carried from a take-out restaurant to an automobile. It has also been a problem in the prior art that take-out cartons necessarily have been supported in the area directly below their food containing cavities such that heat can be transferred directly to the transporter's fingers.
Accordingly, it is the object of this invention to provide an improved two-part take-out carton which provides an effective lock for the cover to the tray which is easily unlocked for total removal of the cover for easy access to the contents of the tray.
An additional object of the invention is an improved two-piece take-out carton in which when the cover is locked on the tray the combination of the cover and tray permit the weight of the carton and its contents to be supported at the ends thereof without the necessity of handling the bottom panel in the area adjacent hot food.
It is a still further object of the invention to provide a take-out carton from paperboard cover and tray blanks which is easy and inexpensive to manufacture but which provides an improved method of locking the cover to the tray and releasing the cover from the tray while providing a handle area so that the cover and tray may be lifted as a unit without unlocking the cover from the tray.
The invention provides a simple mechanical method for joining a flanged tray and tapered cover which accomplishes these objects and provides a thermal barrier to keep the carton contents hot.
COMBINATION ALISKIREN+AMLODIPINE PROVIDES GREATER REDUCTIONS IN CLINIC AND 24-HOUR AMBULATORY BLOOD PRESSURES THAN AMLODIPINE ALONE IN AFRICAN AMERICANS WITH STAGE 2 HYPERTENSION: PP.5.187 Objective: Ambulatory blood pressure monitoring (ABPM) is a better predictor of cardiovascular outcomes than clinic blood pressure (BP). We previously showed in an 8-week, prospective, multicenter, randomized, double-blind study (N=443) that the combination of aliskiren (A), a direct renin inhibitor, and the calcium channel blocker, amlodipine (AML) (A+AML), provided greater clinic BP reductions than AML in African Americans with baseline mean sitting systolic BP (msSBP) >=160 mmHg and < 200 mmHg. Here we present results in a subset of subjects who underwent ABPM. Design and Methods: After a 14 week washout period, men and women received either A+AML 150/5 mg (n = 76) or AML 5 mg (n = 71) for 1 week, then were force-titrated to receive A+AML 300/10 mg or AML 10 mg for 7 weeks. Results: Baseline mean 24-hour ambulatory SBP (maSBP) was 152.8 mmHg in the A+AML group and 147.5 mmHg in the AML group. At week 8, clinic LSM msSBP reductions from baseline were significantly greater with A+AML (-32.5 mmHg) versus AML (-26.7 mmHg; least squares mean difference P < 0.05). LSM reductions in maSBP were -19.0 mmHg for A+AML versus -15.2 mmHg for AML (p=ns). However, using repeated measure analysis of hourly aSBP and aDBP, LSM changes from baseline were significantly greater with A+AML than with AML (Table). In the overall study, adverse event rates were similar in both groups (35.0% in the A+AML group; 32.7% in the AML group); the most common events were peripheral edema, headache, fatigue, and nausea. Conclusion: Compared with AML alone, the combination of A+AML provided greater reductions in both clinic and ambulatory BP values in African Americans with stage 2 hypertension.
Based on the arguments this week it is certainly possible — more likely than not, I would say — that the Supreme Court will strike down the individual mandate and find that the entire statute must fall as a result. What then?
For starters, it is entirely likely that the left, which usually frowns on court-bashing (who can forget the swooning over former justice Sandra Day O’Connor on this point?), will launch an assault on the Supreme Court and its legitimacy, accusing the five “conservative” justices of “politicizing” the law. As transparent as this may be, the argument will need to explain how the five justices are political and the four (including President Obama’s former solicitor general) are pristine defenders of the law. But logic is not a barrier for those who will kick up a fuss.
“Bush and Citizens United, taken together, are both partisan, political and corrupting,” said Tom Perriello, a former Democratic congressman from Virginia who now runs the liberal Center for American Progress Action Fund. “They cast doubt on the legitimacy of the court.” . . .
Harvard law professor Michael Klarman, who has written two histories of the high court, said the fact that the fight over the health-care law is playing out according to the standard Republican vs. Democrat script — the same script as the 2000 election fight — has eroded the idea that the GOP-appointed court is rooted in restraint and precedent-based impartiality.
The left can never be accused of being unpredictable.
Aside from the left’s excuse-mongering what could we expect?
First, Mitt Romney will be forgiven if he indulges in a big “I told you so,” since he defended his health-care bill on the grounds that the federal government is limited in ways the states are not. But then he would be wise to move forward on two fronts.
First, Romney will likely argue the president spent the lion’s share of his first term on a bill that was unconstitutional. He scared employers, burdened the private sector, frittered away congressional time and energy and engaged in nasty partisanship for nothing. And, meanwhile, the economy goes limping along. Certainly, this argument will capitalize on the sense of demoralization that will certainly overtake the left if its handiwork is eradicated.
Romney, however, will also be under pressure to put forth his own, constitutionally-sound heath-care bill. He’s hinted at pieces (allowing inter-state insurance sales, changing the tax characterization of individually purchased insurance), but it will be incumbent on him to formulate a comprehensive plan. Ironically, his track record of pursuing health-care reform will give him some credibility in a general election and make him less susceptible to the claim that he isn’t interested in expanding health-care coverage.
On the Democratic side, after the anger and denunciations die down, the president will also need to come up with a health-care bill that passes constitutional muster. He ran in 2008, if you recall, on a health-care reform plan that did not include a mandate (he inveighed against Hillary Clinton for suggesting people should be forced to buy insurance), so he’d better dust that one off. As for a single-payer system, that would be the sort of thing to propose once he has “more flexibility” after the election.
An overview of inverted colloidal crystal systems for tissue engineering. Scaffolding is at the heart of tissue engineering but the number of techniques available for turning biomaterials into scaffolds displaying the features required for a tissue engineering application is somewhat limited. Inverted colloidal crystals (ICCs) are inverse replicas of an ordered array of monodisperse colloidal particles, which organize themselves in packed long-range crystals. The literature on ICC systems has grown enormously in the past 20 years, driven by the need to find organized macroporous structures. Although replicating the structure of packed colloidal crystals (CCs) into solid structures has produced a wide range of advanced materials (e.g., photonic crystals, catalysts, and membranes) only in recent years have ICCs been evaluated as devices for medical/pharmaceutical and tissue engineering applications. The geometry, size, pore density, and interconnectivity are features of the scaffold that strongly affect the cell environment with consequences on cell adhesion, proliferation, and differentiation. ICC scaffolds are highly geometrically ordered structures with increased porosity and connectivity, which enhances oxygen and nutrient diffusion, providing optimum cellular development. In comparison to other types of scaffolds, ICCs have three major unique features: the isotropic three-dimensional environment, comprising highly uniform and size-controllable pores, and the presence of windows connecting adjacent pores. Thus far, this is the only technique that guarantees these features with a long-range order, between a few nanometers and thousands of micrometers. In this review, we present the current development status of ICC scaffolds for tissue engineering applications.
for _ in range(int(input())):
    # Parse one line of whitespace-separated integers
    mark = list(map(int, input().split()))
    # Select the three values used by the original snippet (it read indices 1, 2 and 2,
    # so y and z carry the same value here)
    x, y, z = mark[1], mark[2], mark[2]
    print(x, y, z)
Hydro One has been working to replace a transformer at its substation along County Road 21 after it was destroyed in a fire July 26.
“Crews have been working on an expedited basis since then,” Andrew Spencer, vice president of transmission and station for Hydro One, said during a tour of the facility on Aug. 30.
Crews were working on levelling the area where the burned transformer once stood, and where its replacement will be installed.
“We’ve been working seven days a week, with extended shifts,” Spencer said. At the height of the cleanup, there were up to 50 employees working at the normally unmanned station.
“There is monitoring on a 24-hour basis from our centre in Barrie,” Spencer explained.
The substation is one of 300 such facilities Hydro One operates throughout the province.
“A station like this was equipped to have two transformers,” Spencer explained. For the time being it is operating with one, and the utility has brought in two mobile transformers that are being housed at the site in case backup is required. It is the first time that Hydro One has used mobile transformers for such an application.
A new transformer will be transported to the site this fall, and Spencer said it is scheduled to be in service by the first week of November. The cost of the transformer, plus the cost to installing it and getting it operational, is expected to total about $5 million.
When the transformer caught fire on the afternoon of July 26, it was during a severe storm that brought thunder, lightning and large hail stones with it. While lightning strikes can cause such fires, “it’s a little premature to say that was the cause,” Spencer said. While an investigation into the cause of the blaze has been ongoing, “we may never know,” Spencer said.
The Minden substation is scheduled for a rebuild, and it’s likely that work will commence in 2019.