Thursday, 10 August 2017

Comparisons of Modeling Approaches for Evaluating the Longitudinal Association in a Clustered Healthcare Intervention Study


biostatistics open access journals
This paper addresses methodological issues in evidence-based healthcare research, specifically in evaluating and analyzing the impact of hospital practice environments (HPE) on patient health outcomes in longitudinal intervention survey studies. HPE comprise spatially clustered hospital characteristics, including practice environment scale (PES) measures, hospital facilities, nursing staffing, and nursing attributes. The longitudinal associations between HPE and patient smoking cessation counseling (SCC) activities and patient heart failure (HF) outcomes are examined. Various longitudinal and hierarchical modeling approaches are compared, including linear mixed models with restricted maximum likelihood estimation, generalized estimating equations with quasi-likelihood estimation, hierarchical linear regression models with nonparametric generalized least squares estimation, and repeated-measures ANOVA.
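For readers who want a concrete starting point, the sketch below (not the authors' code) shows how two of the compared model families could be fitted in Python with statsmodels; the column names (hf_score, scc, pes, visit, hospital) and the synthetic data are assumptions made purely for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic long-format data: 20 hospitals (clusters), 30 records each (hypothetical names).
rng = np.random.default_rng(0)
n_hosp, n_rec = 20, 30
hospital = np.repeat(np.arange(n_hosp), n_rec)
pes = np.repeat(rng.normal(0, 1, n_hosp), n_rec)       # hospital-level PES score
visit = np.tile(np.arange(n_rec) % 3, n_hosp)          # time point 0/1/2
hf_score = 50 + 2 * pes + visit + rng.normal(0, 5, n_hosp * n_rec)   # continuous HF outcome
scc = rng.binomial(1, 1 / (1 + np.exp(-0.5 * pes)))    # binary counseling outcome
df = pd.DataFrame(dict(hospital=hospital, pes=pes, visit=visit,
                       hf_score=hf_score, scc=scc))

# Linear mixed model with a random hospital intercept, fitted by REML.
lmm_fit = smf.mixedlm("hf_score ~ pes + visit", data=df,
                      groups=df["hospital"]).fit(reml=True)
print(lmm_fit.summary())

# GEE with an exchangeable working correlation within hospitals (quasi-likelihood).
gee_fit = smf.gee("scc ~ pes + visit", groups="hospital", data=df,
                  cov_struct=sm.cov_struct.Exchangeable(),
                  family=sm.families.Binomial()).fit()
print(gee_fit.summary())
```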


Tuesday, 8 August 2017

Verifying ‘Einstein’s Time’ by Using the Equation ‘Time = Distance/Velocity’


The statement ‘Every reference-body (co-ordinate system) has its own particular time’, which appears in Einstein’s book ‘Relativity: The Special and General Theory’, is widely accepted among physicists and even by the general public, with the popular interpretation that a clock in a moving body and another clock at rest in the stationary reference body will indicate different values of time.

journal of physical mathematics impact factor
However, upon examining the grounds for this perspective by using the equation ‘time = distance/velocity’ and ‘the principle of the constancy of the velocity of light’, we find that the above sentence should be restated as: ‘Every reference body (coordinate system) has its own particular measurement of the time interval for the propagation of light, and it also has its own particular measurement of the interval of the light path that must be used in order to calculate its time interval. The numerical value of the ratio of these two intervals is always 1:1.’ This implies that the pace of ticking of all clocks is identical, which contradicts the above popular interpretation.
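A minimal way to lay out the ratio argument, in notation chosen here rather than taken from the paper, is:

```latex
% Illustrative notation (not the author's): in a frame K the light path has length d
% and the propagation time is t; in another frame K' they are d' and t'.
% The constancy of the velocity of light gives
\[
  t = \frac{d}{c}, \qquad t' = \frac{d'}{c}
  \quad\Longrightarrow\quad
  \frac{d}{t} = \frac{d'}{t'} = c ,
\]
% so each frame has its own pair of measured intervals (d, t), but the two intervals
% always stand in the same fixed proportion, which is the sense of the 1:1 ratio above.
```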


Wednesday, 26 July 2017

Explicit Calculations of Tensor Product Coefficients for E7


lie algebra journals
We propose a new method to calculate coupling coefficients of E7 tensor products. Our method is based on the explicit use of E7 characters in the definition of a tensor product. When applying the Weyl character formula for the E7 Lie algebra, one needs to sum over the 2,903,040 elements of the E7 Weyl group. We show that such enormous sums can nevertheless be carried out by decomposing an E7 character into 72 participating A7 characters.
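For reference, the Weyl character formula invoked above has the standard form (the 2,903,040-element sums are over the E7 Weyl group W):

```latex
% Standard Weyl character formula: lambda a dominant weight, rho the Weyl vector,
% W the Weyl group, epsilon the sign character; for E7, |W| = 2,903,040.
\[
  \chi_{\lambda}
  = \frac{\sum_{w \in W} \varepsilon(w)\, e^{\,w(\lambda + \rho)}}
         {\sum_{w \in W} \varepsilon(w)\, e^{\,w(\rho)}} .
\]
```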

Monday, 24 July 2017

Lateral Transshipment as an effective approach to cut costs in the multi-item small business sector


inventory management journal article
Consumer behavior is so unpredictable that retailers often face the challenge of meeting demand. Retailers that are in close proximity to the original manufacturers would generally be able to meet demand and provide customer satisfaction. Retailers need to stay in constant touch with the producer for supplies, and this coordination is called lateral transshipment. Lateral transshipment is the best approach to cut costs and plays an important role in inventory management. This approach is more effective in the multi-item small business sector.

Thursday, 20 July 2017

Notes on the Chern-Character


The aim of this note is to give an axiomatic and elementary treatment of Chern-characters of vector bundles with values in a class of cohomology-theories arising in topology and algebra. Given a theory of Chern-classes for complex vector bundles with values in singular cohomology, one gets in a natural way a Chern-character from complex K-theory to singular cohomology using the projective bundle theorem and the Newton polynomials. The Chern-classes of a complex vector bundle may be defined using the notion of an Euler class, and one may prove that a theory of Chern-classes with values in singular cohomology is unique. In this note it is shown that one may relax the conditions on the theory of Chern-classes and still get a Chern-character. Hence the Chern-character depends on some choices.
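For orientation, the familiar expression of the Chern character obtained from Chern classes via the splitting principle and Newton polynomials is (a standard formula, not the note's axiomatic construction):

```latex
% Standard Chern character of a rank-r complex bundle E with Chern roots x_i:
\[
  \mathrm{ch}(E) \;=\; \sum_{i=1}^{r} e^{x_i}
  \;=\; r + c_1(E) + \tfrac{1}{2}\bigl(c_1(E)^2 - 2\,c_2(E)\bigr)
        + \tfrac{1}{6}\bigl(c_1^3 - 3\,c_1 c_2 + 3\,c_3\bigr) + \cdots
\]
```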


Monday, 17 July 2017

New Square Method

The “new square method” is an improved approach based on the “least squares method”. It calculates not only the constants and coefficients but also the variables’ power values in a model in the course of data regression calculations, thus bringing about a simpler and more accurate calculation for non-linear data regression processes.


mathematics impact factor
In non-linear data regression calculations, the “least squares method” is applied through mathematical substitutions and transformations in a model, but the regression results may not always be correct. We have therefore improved the method and named the improved version the “new square method”.
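As a rough illustration of the idea of estimating a variable's power together with the other coefficients, one plausible reading (not the paper's own algorithm) is ordinary nonlinear least squares with the exponent treated as a free parameter:

```python
# Illustrative sketch only: fitting y = a + b * x**p with the exponent p unknown,
# using scipy's nonlinear least squares rather than the paper's own procedure.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, p):
    """Hypothetical model with the power p estimated from the data."""
    return a + b * np.power(x, p)

# Synthetic data generated from a known power law plus noise.
rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 50)
y = 2.0 + 3.0 * x**1.7 + rng.normal(0, 1.0, x.size)

(a_hat, b_hat, p_hat), _ = curve_fit(model, x, y, p0=(1.0, 1.0, 1.0))
print(f"a={a_hat:.3f}, b={b_hat:.3f}, p={p_hat:.3f}")  # p should be close to 1.7
```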

Tuesday, 11 July 2017

A New Method for Analysis of Biomolecules Using the BSM-SG Atomic Models


Biomolecules, and particularly proteins and DNA, exhibit some mysterious features that cannot find a satisfactory explanation in quantum mechanical models of atoms. One of them, known as Levinthal’s paradox, is the ability to preserve their complex three-dimensional structure in appropriate environments. Another is that they possess some unknown energy mechanism.

dna research impact factor
The Basic Structures of Matter Supergravitation Unified Theory (BSM-SG) allows uncovering the real physical structures of the elementary particles and their spatial arrangement in atomic nuclei. The resulting physical models of the atoms are characterized by the same interaction energies as the quantum mechanical models, while the structure of the elementary particles influences their spatial arrangement in the nuclei. The resulting atomic models, with fully identifiable parameters and angular positions of the quantum orbits, permit studying the physical conditions behind the structural and bonding restrictions of atoms connected in molecules.


Thursday, 6 July 2017

Riccati-Bernoulli Sub-ODE as a new technique to find solutions to nonlinear equations


algebraic equations impact factor
The Riccati-Bernoulli Sub-ODE method is a new method to construct exact traveling wave solutions of the nonlinear modified Korteweg-de Vries (mKdV) equation. The method can also be used to solve the nonlinear, random modified Korteweg-de Vries (mKdV) equation and serves as an effective tool for many problems in mathematics and physics. The travelling wave solutions of these equations can be expressed by hyperbolic functions, trigonometric functions and rational functions. Although several methods exist for finding analytical solutions of linear and nonlinear equations, the Jacobi elliptic function method can find exact solutions of nonlinear equations because these equations can be converted into a set of algebraic equations.
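For orientation, a sketch of the usual setup in standard notation (the exact normalization of the mKdV equation and of the sub-ODE may differ from the authors'):

```latex
% mKdV equation, traveling-wave reduction, and the Riccati-Bernoulli sub-ODE as it is
% usually written in the literature (constants alpha, beta, gamma and exponent m assumed here):
\[
  u_t + \delta\, u^{2} u_x + u_{xxx} = 0,
  \qquad u(x,t) = u(\xi), \quad \xi = x - c\,t,
\]
\[
  u'(\xi) = \alpha\, u^{2-m} + \beta\, u + \gamma\, u^{m}.
\]
% Substituting the traveling-wave ansatz reduces the PDE to an ODE in \xi; solving the
% sub-ODE (hyperbolic, trigonometric or rational solutions, depending on the constants)
% then generates the traveling-wave solutions.
```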

Monday, 3 July 2017

Solitary Waves for the Modified Korteweg-De Vries Equation in Deterministic Case and Random Case

physical mathematics journal
In this paper, we present a new method, the so-called Riccati-Bernoulli Sub-ODE method, to construct exact traveling wave solutions of the nonlinear modified Korteweg-de Vries (mKdV) equation, and we also use this method to solve the nonlinear random modified Korteweg-de Vries (mKdV) equation. It is shown that the proposed method is an effective tool for solving many mathematical physics problems. The travelling wave solutions of these equations are expressed by hyperbolic functions, trigonometric functions and rational functions. The influence of the random coefficient in our problem is studied by using some distributions through case studies.

Invariant Tensor Product


journal of generalized lie theory and applications impact factor
Various forms of invariant tensor products appeared in the literature implicitly, for example, in Schur’s orthogonality for finite groups. In many cases, they are employed to study the space HomG(π1, π2) where one of the representations π1 and π2 is irreducible. In this paper, we formulate the concept of invariant tensor product uniformly. We also study the invariant tensor functor associated with discrete series representations for classical groups. For motivations and applications.

Friday, 30 June 2017

From Monge-Ampère-Boltzmann to Euler Equations


applied computational mathematics journal
Hsiao studied the convergence of the VPB system to the incompressible Euler equations. Bernier and Grégoire showed that weak solutions of the Vlasov-Monge-Ampère system converge to a solution of the incompressible Euler equations when the parameter goes to 0; see Brenier and Loeper for details. So it is a legitimate question to ask whether a weak solution of BMA (of course, if such a solution exists) converges to a solution of the incompressible Euler equations when the parameter goes to 0.
The study of the existence and uniqueness of solutions to the BMA system seems a difficult matter. Here we assume the existence and uniqueness of a smooth solution to the BMA system and restrict ourselves to the asymptotic analysis of this system.
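For reference, the incompressible Euler system that the limit is expected to recover, written in standard notation:

```latex
% Incompressible Euler equations (v velocity field, p pressure):
\[
  \partial_t v + (v \cdot \nabla) v + \nabla p = 0,
  \qquad \nabla \cdot v = 0 .
\]
```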

Thursday, 29 June 2017

On Finding the Upper Confidence Limit for a Binomial Proportion when Zero Successes are Observed


biometrics open access journals
We consider confidence interval estimation for a binomial proportion when the data have already been observed and x, the observed number of successes in a sample of size n, is zero. In this case, the main objective of the investigator is usually to obtain a reasonable upper bound for the true probability of success, i.e., the upper limit of a one-sided confidence interval. In this article, we use observed interval length and p-confidence to evaluate eight methods for finding the upper limit of a confidence interval for a binomial proportion when x is known to be zero. Long-run properties such as expected interval length and coverage probability are not applicable because the sample data have already been observed. We show that many popular approximate methods that are known to have good long-run properties in the general setting perform poorly when x=0 and recommend that the Clopper-Pearson exact method be used instead.
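A minimal sketch of the recommended Clopper-Pearson exact upper limit (generic formulas, not the authors' code); for x = 0 it reduces to the closed form 1 − α^(1/n), close to the familiar "rule of three" 3/n:

```python
# Hedged sketch: exact one-sided upper confidence limit for a binomial proportion,
# evaluated at x = 0 observed successes.
from scipy.stats import beta

def clopper_pearson_upper(x: int, n: int, alpha: float = 0.05) -> float:
    """Clopper-Pearson exact one-sided upper limit via the beta quantile."""
    if x == n:
        return 1.0
    return beta.ppf(1.0 - alpha, x + 1, n - x)

n, alpha = 30, 0.05
print(clopper_pearson_upper(0, n, alpha))   # exact upper limit for x = 0
print(1.0 - alpha ** (1.0 / n))             # same value in closed form, ~0.095 (vs. 3/n = 0.1)
```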


Wednesday, 28 June 2017

Important Discovery of a Preferred Velocity of 30,000 ± 425 m/s of the Solar Motion of the Earth


With the recent advancement of technology, particularly digital storage oscilloscopes, we analyze what have become known in the literature as precursor signals. However, they are usually attributed to a symmetrical Fourier analysis of square pulses with dual time components: one unnatural time prior to the pulse event and a second, natural time after the pulse event.

physical mathematics journal
However, not every arbitrary mathematical analysis is suitable for a particular physical case, just as the complex solutions of an equation for a physical problem are usually rejected as being complex numbers. Also, a particular number may be expressed as the limit of infinitely many numerical sequences; this does not mean that every term of each such sequence plays a real role in a physical situation.

Friday, 23 June 2017

Studies of the Regular and Irregular Isorepresentations of the Lie-Santilli Isotheory


As is well known, the Lie theory is solely applicable to dynamical systems consisting of point-like particles moving in vacuum under linear and Hamiltonian interactions (systems known as exterior dynamical systems). One of the authors (R.M. Santilli) has proposed an axiom-preserving broadening of the Lie theory, known as the Lie-Santilli isotheory, that is applicable to dynamical systems of extended, nonspherical and deformable particles moving within a physical medium under Hamiltonian as well as non-linear and non-Hamiltonian interactions (broader systems known as interior dynamical systems).

journal of generalized lie theory and applications
In this paper, we study, apparently for the first time, regular and irregular isorepresentations of Lie-Santilli isoalgebras, occurring when the structure quantities are constants or functions, respectively. A number of applications to particle and nuclear physics are indicated. It should be noted that this paper is specifically devoted to the study of isorepresentations under the assumption of a knowledge of the Lie-Santilli isotheory, as well as of the isotopies of the various branches of 20th century applied mathematics, collectively known as isomathematics, which is crucial for the consistent formulation and elaboration of isotheories.

Thursday, 22 June 2017

Statistical Process Capability Design to Improve Process Stability of a Molding Machine


In the justification of production and manufacturing processes over time, process capability in the context of statistical control has been of great importance because it promotes the production of products that satisfy the expectations of consumers. This paper aims to promote the adoption of quality process capability design in a bid to improve the stability of a process and improve process performance over the long run of production.

computational mathematics journal
To ensure this, the technique of design of experiments is adopted using a factorial design, after which a capability analysis is carried out on data on plastic containers produced by molding machines. From the control X-bar and Range charts, it was established that the process was stable and in a state of statistical quality control, and that the plastic containers produced differed from one another as a result of variation on the part of the operators of the molding machines.
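For orientation, the capability indices behind such an analysis can be sketched as follows (generic SPC formulas with assumed specification limits and synthetic data, not the authors' measurements):

```python
# Illustrative sketch of process capability indices: within-subgroup sigma estimated
# from the average range via the d2 constant, then Cp and Cpk computed.
import numpy as np

# Hypothetical X-bar/R data: 25 subgroups of size 5 from the molding process.
rng = np.random.default_rng(2)
subgroups = rng.normal(loc=100.0, scale=2.0, size=(25, 5))

xbar_bar = subgroups.mean()                  # grand mean
r_bar = np.ptp(subgroups, axis=1).mean()     # average subgroup range
d2 = 2.326                                   # control-chart constant for subgroup size 5
sigma_within = r_bar / d2

LSL, USL = 94.0, 106.0                       # assumed specification limits
Cp  = (USL - LSL) / (6 * sigma_within)
Cpk = min(USL - xbar_bar, xbar_bar - LSL) / (3 * sigma_within)
print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}")
```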

Wednesday, 21 June 2017

Comparison of the Hemoglobin Amount between Old and Young Persons in Bosnia and Herzegovina

Hemoglobin is a unique protein responsible for oxygen and carbon dioxide transportation throughout the body. The protein is located inside the erythrocytes, whose special oval shape lets them pass easily through blood vessel walls to supply oxygen to the tissues and organs. It is supposed that the hemoglobin amount could change depending on a person's age, gender or nationality. We designed a study to see the molecular differences between Bosnian and Turkish young persons, whose age interval is 18-23, and older persons, whose age interval is 43-65.

biostatistics journal articles
In total, 300 persons were selected for the research: 50 from each Bosnian/Turkish female/male group, plus 50 older males and 50 older females. Each participant's hemoglobin amount was recorded individually and presented in a table. The measurements show that Turkish females have the lowest average hemoglobin and Turkish males the highest, 12.01 g/dl and 14.65 g/dl respectively. As females get older, the hemoglobin amount in their blood increases, by 7.3% in Bosnians and 12.2% in Turks. On the other hand, male hemoglobin remains almost unchanged with aging: Bosnian male hemoglobin increases by only 0.74%, while Turkish male hemoglobin decreases by 1.3%. The results show that female hemoglobin levels are affected by age more than male levels.

Monday, 19 June 2017

Convergence and the Grand Unified Theory

This paper examines the relationship between the four fundamental forces and shows how, using fluid mechanics, electromagnetism, quantum mechanics and gravity, these forces converge on one solution, just as mathematics does. An equation for the universal processes is provided.

physical mathematics journal
This paper shows that, just as Mathematics, namely Algebra, Linear Algebra, Geometry and Calculus, converges to one solution, so too do Fluid Mechanics, Quantum Mechanics, Electromagnetism, and Gravity converge to one solution, namely the Superforce. We begin with mathematical convergence and end with Cusack’s Universal Equation. Cosmology and Quantum Mechanics are united.

Thursday, 15 June 2017

Non-associative slave-boson decomposition

journal of generalized lie theory and applications impact factor
Everybody knows that the algebra of non-perturbative operators in quantum theory exists, but nobody knows its exact form. In this paper we discuss the idea that the constraint (2.2) in the t-J model of high-temperature superconductivity is a new generating relation for an algebra of operators whose product gives us the electron operator. On the perturbative level, the algebra of quantum fields is defined by canonical (anti)commutation relations. The algebra of non-perturbative operators should be more complicated and should be generated not only by canonical (anti)commutation relations but by other generating relations as well. In this paper we discuss the idea that the constraint (2.2) is an antiassociator in a non-associative algebra of quantum non-perturbative operators.

Wednesday, 14 June 2017

An Application of SLA in Small Business Sectors with Multi-Item Inventory Management


All types of retailers face uncertainty in consumer demand. Retailers who have long replenishment lead times from the original supplier and are located close to each other coordinate to prevent stockouts and increase the service level for their customers. This coordination is simply called lateral transshipment, which refers to the redistribution of stock from a retailer with stock on hand to a retailer who is expecting a significant loss due to high risk.

computational mathematics journal
Although transportation costs increase, lateral transshipment is known to be a better approach than a policy of no transshipment. Lateral transshipment was implemented by Young Hae Lee at an airport for airplane inspection. However, it is difficult to apply within companies or departments due to loss of information and insufficient understanding between companies.

Monday, 12 June 2017

Identifying DNA Methylation Variation Patterns to Obtain Potential Breast Cancer Biomarker Genes

Patterns of DNA methylation in human cells are crucial in regulating tumor growth and can be indicative of breast cancer susceptibility. In our research, we have pinpointed genes with significant methylation variation in the breast cancer epigenome to be used as potential novel biomarkers for breast cancer susceptibility.
Using the statistical software package R, we compare DNA methylation sequencing data from seven normal individuals with eight breast cancer cell lines. This is done by selecting CG sites, or cytosine-guanine pairings, at which normal cell and cancer cell variation patterns fall in different ranges, and by performing upper one-tailed chi-square tests. These selected CG sites are mapped to their corresponding genes. Using the Consensus Path Database software, we generate genetic pathways with our data to study biological relations between our selected genes and tumorigenic cellular mechanisms.
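As a hedged illustration of what an upper one-tailed chi-square test at a single CG site might look like (hypothetical numbers; the study's actual test construction may differ):

```python
# Sketch only, not the study's pipeline: testing whether methylation variance in the
# cancer cell lines exceeds the variance seen in the normal samples at one CG site.
import numpy as np
from scipy.stats import chi2

normal = np.array([0.12, 0.15, 0.10, 0.14, 0.13, 0.11, 0.12])        # 7 normal samples
cancer = np.array([0.05, 0.40, 0.22, 0.70, 0.15, 0.55, 0.33, 0.60])  # 8 cell lines

sigma0_sq = normal.var(ddof=1)                 # reference variance from normal cells
n = cancer.size
stat = (n - 1) * cancer.var(ddof=1) / sigma0_sq
p_value = chi2.sf(stat, df=n - 1)              # upper one-tailed p-value
print(f"chi2 = {stat:.2f}, p = {p_value:.4g}")
```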

Contrast Variable Potentially Providing a Consistent Interpretation to Effect Sizes

Various effect sizes have been proposed. However, different effect size measures are suitable for different types of data, and the interpretations of effect sizes are generally arbitrary and remain problematic.

biostatistics and biometrics open access journal impact factor
In this article, the concepts of the contrast variable, its standardized mean (SMCV) and the c+-probability are explored to link together commonly used effect sizes, including the probabilistic index and ratios of mean difference to variability. A contrast variable can provide both a probabilistic meaning and an index of signal-to-noise ratio to interpret the strength of a comparison, which offers a strong basis for classifying the strength of a comparison. The contrast variable, SMCV and c+-probability not only give interpretations to both Cohen’s and McLean’s criteria but also work effectively and consistently for either relationship or group comparison, in either independent or correlated situations, and in either two or more than two groups.
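For orientation, the working definitions commonly used for these quantities (stated here in generic notation; the article's exact formulation may differ) are:

```latex
% For groups Y_1, ..., Y_g and contrast coefficients c_1, ..., c_g summing to zero,
% the contrast variable, its standardized mean, and the c+-probability are
\[
  V = \sum_{i=1}^{g} c_i Y_i, \qquad
  \mathrm{SMCV} = \frac{\mathbb{E}(V)}{\sqrt{\mathrm{Var}(V)}}, \qquad
  c^{+}\text{-probability} = \Pr(V > 0).
\]
```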



Friday, 9 June 2017

Mathematical Theory of Space-Time

We hypothesize that the Universe contains observable subluminal matter (tardyons and locality) and unobservable superluminal matter (tachyons and nonlocality). Using the space-time ring, we establish a mathematical theory of space-time with subluminal and superluminal coexistence.

physical mathematics journal
The rotating motions of tardyons and tachyons produce centrifugal and centripetal (gravity) forces, respectively. This paper deduces a new gravitational formula and the Newtonian gravitational formula from the tardyonic and tachyonic coexistence principle. Through the morphism we first show that tardyons and tachyons are interchangeable, but that tachyons are unobservable. We then convert the tachyonic mass into tardyonic mass and obtain the gravitational coefficient η = 6.9 × 10⁻¹⁰. Using it, we establish the expansion theory of the universe and suggest a new model of the universe.


Thursday, 8 June 2017

A canonical semi-classical star product

We study the Maurer-Cartan equation of the pre-Lie algebra of graphs controlling the deformation theory of associative algebras. We prove that there is a canonical solution (choice independent) within the class of graphs without circuits, i.e. at the level of the free operad, without imposing the Jacobi identity.

journal of generalized lie theory and applications
The proof is a consequence of the unique factorization property of the pre-Lie algebra of graphs (tree operad), where composition is the insertion of graphs. The restriction to graphs without circuits, i.e. at “tree level”, accounts for the interpretation as a semi-classical solution. The fact that this solution is canonical should not be surprising, in view of the Hausdorff series, which lies at the core of almost all quantization prescriptions.

Tuesday, 6 June 2017

Fixed Point Theory for Three φ–Weak Contraction Functions

Alber and Guerre-Delabriere introduced the concept of weak contraction in 1997, defining such mappings for single-valued maps on Hilbert spaces and proving the existence of fixed points. Rhoades showed that most results on weak contractions remain true in any Banach space; moreover, he proved the following very interesting fixed point theorem, which is a generalization of the Banach contraction principle because it contains contractions as special cases (ϕ(t) = (1−k)t).
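For reference, the φ-weak contraction condition referred to above is usually stated as follows:

```latex
% Standard formulation: a map T on a metric space (X, d) is a phi-weak contraction if
\[
  d(Tx, Ty) \;\le\; d(x, y) - \varphi\bigl(d(x, y)\bigr)
  \qquad \text{for all } x, y \in X,
\]
% where phi : [0, infinity) -> [0, infinity) is continuous and nondecreasing with
% phi(t) = 0 iff t = 0. Taking phi(t) = (1 - k)t recovers the Banach contraction
% d(Tx, Ty) <= k d(x, y).
```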

Monday, 5 June 2017

A Likelihood Ratio Test for Homogeneity in Circular Data

journal of biometrics and biostatistics impact factor
Testing for the homogeneity of density functions of circular random variables is useful in many settings, including the study of wind patterns, paleocurrent trends, the seasonality of human-related events such as homicides and suicides, and the seasonality in the appearance of diseases. In this paper, we consider density functions that are members of the flexible family of circular distributions based on non-negative trigonometric (Fourier) sums (series) developed by Fernandez-Duran. We construct a test based on the likelihood ratio and apply the proposed test to simulated and real data sets.
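Schematically, the likelihood ratio statistic for homogeneity takes the generic form below (standard notation, not the paper's exact parameterization):

```latex
% With k circular samples, H0 fits one common density and H1 fits a separate density
% per sample; the likelihood ratio statistic is
\[
  \Lambda = -2\,\bigl[\ell(\hat{\theta}_{H_0}) - \ell(\hat{\theta}_{H_1})\bigr],
\]
% compared against a chi-square reference whose degrees of freedom equal the difference
% in the number of free parameters between the two fits.
```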

Thursday, 1 June 2017

An operadic approach to deformation quantization of compatible Poisson brackets, I

An analogue of the Livernet–Loday operad for two compatible brackets, which is a flat deformation of the bi-Hamiltonian operad, is constructed. The Livernet–Loday operad can be used to define star products and deformation quantization for Poisson structures. The constructed operad is used in the same way, introducing a definition of operadic deformation quantization of compatible Poisson structures.

mathematics journal
Some constructions of deformation quantization are now known for the most important case, namely the algebra of functions on a smooth Poisson manifold; see, for example, the work of Kontsevich. It seems to be much more difficult to deal with deformation quantization of two compatible Poisson brackets, even on the level of introducing the problem and giving the necessary definitions.

Wednesday, 31 May 2017

A New Method on Measure of Similarity between Interval-Valued Intuitionistic Fuzzy Sets for Pattern Recognition

The theory of fuzzy sets, proposed by Zadeh, has found successful applications in various fields. Measures of similarity between fuzzy sets have gained attention from researchers for their wide applications in the real world. Similarity measures are very useful in areas such as pattern recognition, machine learning, decision making and market prediction. Many measures of similarity between fuzzy sets have been proposed.

computational mathematics journal
Atanassov presented intuitionistic fuzzy sets, which are very effective in dealing with vagueness. Gau and Buehrer researched vague sets. Bustince and Burillo pointed out that the notion of vague sets is the same as that of interval-valued intuitionistic fuzzy sets. Chen and Tan proposed two similarity measures for measuring the degree of similarity between vague sets. De et al. defined some operations on intuitionistic fuzzy sets. Szmidt and Kacprzyk introduced the Hamming distance between intuitionistic fuzzy sets and proposed a similarity measure between intuitionistic fuzzy sets based on the distance.
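For orientation, the Szmidt-Kacprzyk-type normalized Hamming distance mentioned above is commonly written as follows (the cited papers' exact normalization may differ):

```latex
% For intuitionistic fuzzy sets A, B on X = {x_1, ..., x_n}, with membership mu,
% non-membership nu and hesitation pi = 1 - mu - nu, a normalized Hamming distance and
% the induced similarity are
\[
  d(A,B) = \frac{1}{2n} \sum_{i=1}^{n}
    \Bigl( |\mu_A(x_i) - \mu_B(x_i)| + |\nu_A(x_i) - \nu_B(x_i)|
           + |\pi_A(x_i) - \pi_B(x_i)| \Bigr),
  \qquad S(A,B) = 1 - d(A,B).
\]
```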

Tuesday, 30 May 2017

Modeling the Log Density Distribution with Orthogonal Polynomials

biostatistics journal articles
Density estimation is one of the most important and difficult statistical problems. There exists a vast literature on the topic, and it is not the goal of this paper to provide an overview of all existing approaches. Mainly three methodologies have been developed. First, a nonparametric approach can be used, such as kernel density estimation, penalized maximum likelihood or spline density smoothing. Second, finite mixtures can be applied to model multimodal distributions. Third, polynomials can be used for log density modelling and estimation. The latter approach, taken in the present paper, is the simplest and can be used at a preliminary stage of density estimation.
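Schematically, the third approach models the log density as follows (generic notation, not the paper's particular basis):

```latex
% Log density expanded in orthogonal polynomials P_k, with a normalizing constant
% psi(theta) ensuring the density integrates to one:
\[
  \log f(x \mid \theta) = \sum_{k=1}^{K} \theta_k P_k(x) - \psi(\theta),
  \qquad
  \psi(\theta) = \log \int \exp\!\Bigl(\sum_{k=1}^{K} \theta_k P_k(x)\Bigr)\, dx .
\]
```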

Monday, 29 May 2017

Identification of optimal topography of the barotropic ocean model in the North Atlantic by variational data assimilation

journal of physical mathematics impact factor
The use of the data assimilation technique to identify optimal topography is discussed in the framework of time-dependent motion governed by a nonlinear barotropic ocean model. Assimilation of artificially generated data allows us to measure the influence of various error sources and to classify the impact of noise that is present in observational data and model parameters. The choice of the length of the assimilation window in 4D-Var is discussed. It is shown that using longer window lengths provides more accurate ocean topography. The topography defined using this technique can be further used in other model runs that start from other initial conditions and are situated in other parts of the model’s attractor.
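Schematically, the 4D-Var identification minimizes a cost function of the generic form below (written here for a topography control h; the model's actual formulation may differ):

```latex
% B and R_i are background- and observation-error covariances, H_i the observation
% operator, M_{0->i} the model propagation over the assimilation window, y_i the data:
\[
  J(h) = \tfrac{1}{2}\,(h - h_b)^{\mathsf T} B^{-1} (h - h_b)
       + \tfrac{1}{2} \sum_{i=0}^{N}
         \bigl(H_i\,M_{0\to i}(h) - y_i\bigr)^{\mathsf T} R_i^{-1}
         \bigl(H_i\,M_{0\to i}(h) - y_i\bigr).
\]
% Longer assimilation windows constrain h more strongly, consistent with the abstract's
% remark that longer windows give a more accurate topography.
```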

Friday, 26 May 2017

On irreducible weight representations of a new deformation Uq(sl2) of U(sl2)

Starting from a Hecke R-matrix, Jing and Zhang constructed a new deformation Uq(sl2) of U(sl2) and studied its finite dimensional representations in [Pacific J. Math., 171 (1995), 437-454]. In this note, more irreducible representations for this algebra are constructed. First, using methods from noncommutative algebraic geometry, the points of the spectrum of the category of representations over this new deformation are studied. The construction recovers all finite dimensional irreducible representations classified by Jing and Zhang, and yields new families of infinite dimensional irreducible weight representations.

journal lie theory impact factor
Spectral theory of abelian categories was first initiated by Gabriel. In particular, Gabriel defined the injective spectrum of any noetherian Grothendieck category. The injective spectrum consists of isomorphism classes of indecomposable injective objects in the category, endowed with the Zariski topology. If R is a commutative noetherian ring, then the injective spectrum of the category of all R-modules is homeomorphic to the prime spectrum of R. This homeomorphism is a part (and the main step in the argument) of Gabriel’s reconstruction theorem, according to which any noetherian commutative scheme can be uniquely reconstructed, up to isomorphism, from the category of quasi-coherent sheaves on it.

Thursday, 25 May 2017

Assessing the Lifestyle of an Extinct Animal by Studying its Autonomic Nervous System

The titanosaurs became extinct 66 million years ago. Their size restricted their mobility, thermoregulation and blood supply to the organs. The study deployed modeling techniques and statistics to assess the prehistoric physiology, size, metabolism and organ function.

applied mathematics impact factor
The autonomic nervous system ensured blood flow to the brain through muscle contraction and special bony structures in the neck called cervical ribs. In order to keep pace with the huge metabolic needs of the body, the animal could sleep for only 3 hours a day and had very high body temperatures, which increased with activity. There was an in-built cooling mechanism in the carotid arteries to preserve brain function. This study identified that growth lines in osteons in the anterior process of the rib grew faster than dense bone, and that juveniles grew faster than adults. Power spectral analyses of growth intervals in osteons showed a ratio of 1.3 (LF/HF) and, for bone, 1.4 (LF/HF), NS. The life span of the animal was 100 years.

Wednesday, 24 May 2017

Dilatation structures I. Fundamentals

A dilatation structure is a concept in between a group and a differential structure. In this article we study fundamental properties of dilatation structures on metric spaces. This is a part of a series of papers which show that such a structure allows one to do non-commutative analysis, in the sense of differential calculus, on a large class of metric spaces, some of them fractals. We also describe a formal, universal calculus with binary decorated planar trees, which underlies any dilatation structure.

journal of generalized lie theory and applications
The purpose of this paper is to introduce dilatation structures on metric spaces. A dilatation structure is a concept in between a group and a differential structure. Any metric space (X, d) endowed with a dilatation structure has an associated tangent bundle. The tangent space at a point is a conical group; that is, the tangent space has a group structure together with a one-parameter group of automorphisms. Conical groups generalize Carnot groups, i.e., nilpotent groups endowed with a graduation. Each dilatation structure leads to a non-commutative differential calculus on the metric space (X, d).

Tuesday, 23 May 2017

Conformal Geometry in Engineering and Medicine

With the development of 3D geometric acquisition technologies, massive 3D geometric data are ubiquitous today. It is a great challenge to process and analyze 3D geometric data efficiently and accurately. Computational conformal geometry is an emerging interdisciplinary field, which combines modern geometry with computer science and offers rigorous and practical tools for tackling massive geometric data processing problems. The concepts and methods of conformal geometry play fundamental roles in many fields in engineering and medicine.

applied mathematics open access journal
Conformal geometry studies the invariants under the conformal transformation (angle preserving mapping) group. Conformal geometry is more flexible than Riemannian geometry, and more rigid than topology. Conformal geometry is capable of unifying all shapes in real world to one of three canonical shapes, the sphere, the plane, or the hyperbolic disk; conformal geometric algorithms convert 3D geometric processing problems to 2D image processing problems; furthermore, all surfaces in real life have conformal structures, therefore conformal geometric methods are general. These merits make conformal geometry a powerful tool for real applications.

Friday, 19 May 2017

Power of Permutation Tests Using Generalized Additive Models with Bivariate Smoothers

biometrics impact factor
In spatial epidemiology, when applying Generalized Additive Models (GAMs) with a bivariate locally weighted regression smooth over longitude and latitude, a natural hypothesis is whether location is associated with an outcome, i.e. whether the smoothing term is necessary. An approximate chi-square test (ACST) is available but has an inflated type I error rate. Permutation tests can provide appropriately sized alternatives. This research evaluated the powers of ACST and four permutation tests: the conditional (CPT), fixed span (FSPT), fixed multiple span (FMSPT), and unconditional (UPT) permutation tests.
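As a schematic of how such a permutation test can be organized (not the paper's implementation; smooth_statistic is a hypothetical user-supplied function returning, for example, the deviance explained by the bivariate smooth):

```python
# Sketch of a permutation test of the location smooth: permute coordinates among
# observations to build the null distribution of the smoother's contribution.
import numpy as np

def permutation_pvalue(coords, y, smooth_statistic, n_perm=999, seed=0):
    """Monte Carlo permutation p-value for an arbitrary test statistic."""
    rng = np.random.default_rng(seed)
    observed = smooth_statistic(coords, y)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(y))
        if smooth_statistic(coords[perm], y) >= observed:
            count += 1
    # Add-one correction so the estimated p-value is never exactly zero.
    return (count + 1) / (n_perm + 1)
```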