The paper presents a proposal for assessing the reliability of steel trusses (both statically determinate and indeterminate) in the persistent and accidental design situations. A probabilistic approach was used in the analysis. The global Hasofer-Lind reliability index was employed, computed in successive time steps for the whole structure rather than for individual elements. From the reliability standpoint, the statically determinate truss was modelled as a series system. For the statically indeterminate truss, kinematically admissible failure mechanisms were determined by examining the singularity of the stiffness matrix of the structure, which converts the truss into a geometrically variable system. For the problem thus formulated, a series-parallel reliability model was constructed. Monitoring the reliability index in successive minutes of the fire makes it possible to estimate the probability of structural failure and to decide whether the required safety level is maintained.
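The weakest-link logic of such a model can be sketched numerically. The following is a minimal illustration, assuming independent member limit states, linear limit functions g = R − S with normal R and S, and invented member statistics (not the paper's data):

```python
# Minimal sketch: Hasofer-Lind indices of individual members combined into
# a series-system ("weakest link") index for the whole truss.
# All member statistics below are assumed, illustrative values.
from statistics import NormalDist

norm = NormalDist()

def member_beta(mu_r, sigma_r, mu_s, sigma_s):
    # Reliability index for a linear limit state g = R - S, R and S normal.
    return (mu_r - mu_s) / (sigma_r ** 2 + sigma_s ** 2) ** 0.5

# (mu_R, sigma_R, mu_S, sigma_S) per member at a given minute of fire exposure.
members = [(400.0, 40.0, 250.0, 25.0),
           (380.0, 38.0, 260.0, 26.0),
           (420.0, 42.0, 240.0, 24.0)]

betas = [member_beta(*m) for m in members]
p_fail = [norm.cdf(-b) for b in betas]          # member failure probabilities

# Series system: the truss fails as soon as any one member fails.
p_survive = 1.0
for p in p_fail:
    p_survive *= 1.0 - p
p_sys = 1.0 - p_survive
beta_sys = -norm.inv_cdf(p_sys)                 # global reliability index
```

Repeating this computation with temperature-degraded member resistances at successive fire minutes yields a monitored time history of the global index.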
A hallmark of professionalism in probabilistic analysis is to quantify the uncertainties in construction materials, which are subject to intrinsic randomness in their physical and mechanical properties; this approach is now gaining popularity in the civil engineering arena. Moreover, knowledge of material behaviour is continuously evolving, and its statistical descriptors change as more data are collected or updated, so reliability analysis has to be carried out with the updated data as a continuous process. According to the committee report ACI 544.2R, no attempt has been made to establish a probabilistic relation between cube compressive strength and cylinder compressive strength for fiber reinforced concrete. In consequence, a robust relation between experimentally measured cube and cylinder compressive strengths was established by the Monte Carlo simulation technique for different types of fibrous concrete (steel, alkali-resistant glass and polyester), before and after thermal shock, considering various uncertainties. From the resulting probability-of-failure graphs, simulated probabilistic models, characteristic models, an optimized factor of safety and an allowable design cylinder compressive strength have been developed, which exhibit robust performance for realistic civil engineering materials and structures.
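The Monte Carlo propagation of such uncertainties can be illustrated with a toy version, hypothetically assuming normally distributed cube strength and conversion factor; all numerical values below are invented, not the experimental data:

```python
# Toy Monte Carlo sketch: propagate uncertainty in cube strength and in the
# cube-to-cylinder conversion factor; estimate the probability that the
# cylinder strength falls below an assumed design threshold.
import random

random.seed(1)
N = 100_000
mu_cube, cov_cube = 40.0, 0.10   # mean (MPa) and coefficient of variation, assumed
mu_k, cov_k = 0.8, 0.05          # cube-to-cylinder factor, assumed

cylinder = []
for _ in range(N):
    f_cube = random.gauss(mu_cube, cov_cube * mu_cube)
    k = random.gauss(mu_k, cov_k * mu_k)
    cylinder.append(k * f_cube)

mean_cyl = sum(cylinder) / N
f_design = 25.0                                       # assumed threshold, MPa
p_f = sum(1 for f in cylinder if f < f_design) / N    # probability of failure
```

The empirical probability-of-failure curve is obtained by repeating the last step over a grid of thresholds.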
Statistical conformity criteria for the compressive strength of concrete are a matter of debate. The criteria can have prejudicial effects on construction quality and reliability; hence, the usefulness of statistical criteria for the small sample size n = 3 is questioned. Such defects can reduce the quality of produced concrete and, consequently, impose too much risk on the recipient (investor). For this reason, the influence of conformity control on the value of the reliability index of concrete and reinforced concrete structures has been determined. The authors limited their considerations to the standards PN-EN 206-1, PN-EN 1992 and ISO 2394 and to the reliability index method, which belongs to the analytical FORM (First Order Reliability Method) class. It assumes that the random variables are defined by two parameters of the normal distribution, or of an equivalent normal distribution: the mean and the standard deviation. The impact of conformity control with n = 3 on concrete structures designed according to Eurocode 1992, for which the compressive strength of concrete is the dominant capacity parameter (sensitivity factor of the dominating resistance parameter according to FORM equal to 0.8), has been determined by evaluating the reliability index.
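For a single normally distributed resistance variable, the FORM quantities involved reduce to simple closed forms. A sketch, with the target index and strength statistics assumed for illustration (β = 3.8 is the EN 1990 example target for a 50-year reference period, class RC2):

```python
# Sketch: with sensitivity factor alpha_R = 0.8 (dominating resistance
# parameter) and a target index beta, the design strength is the
# Phi(-alpha_R * beta) fractile of the strength distribution.
from statistics import NormalDist

norm = NormalDist()
alpha_R = 0.8        # FORM sensitivity factor for the dominating resistance
beta_t = 3.8         # assumed target reliability index (EN 1990, RC2, 50 years)

p_design = norm.cdf(-alpha_R * beta_t)     # fractile defining the design value

mu_fc, sigma_fc = 38.0, 5.0                # assumed mean and std of strength, MPa
f_cd = mu_fc + norm.inv_cdf(p_design) * sigma_fc   # design compressive strength
```

Conformity control with n = 3 effectively shifts the delivered strength distribution, and hence the attained β, which is what the evaluation above quantifies.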
A probabilistic analysis of a space truss is presented in the paper. The reliability of such a structure is sensitive to geometrical and material imperfections. The objective of this paper is to present a variant of the point estimate method (PEM) for determining the mean values and standard deviations of limit loads of engineering structures. The main advantage of this method is the small number of sample calculations required to obtain estimators of the investigated parameters. The method is thus straightforward, requiring only the rudiments of probability theory. The approach is illustrated by a limit state analysis of a space truss, considering geometric and material imperfections. The calculations were performed for different random models, so the influence of the random parameters on the limit load of the truss could be determined. A realistic snow load was imposed.
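One common PEM variant is Rosenblueth's two-point estimate; a minimal sketch with a hypothetical limit-load function (not the truss model of the paper):

```python
# Rosenblueth's two-point estimate method for n uncorrelated variables with
# symmetric distributions: evaluate the response at mu_i +/- sigma_i
# (2^n points, equal weights 1/2^n) and form the first two moments.
from itertools import product

def pem_mean_std(g, means, stds):
    n = len(means)
    w = 1.0 / 2 ** n
    vals = [g([m + sign * s for m, sign, s in zip(means, signs, stds)])
            for signs in product((-1.0, 1.0), repeat=n)]
    m1 = w * sum(vals)                       # estimated mean of g
    m2 = w * sum(v * v for v in vals)        # estimated second raw moment
    return m1, max(m2 - m1 * m1, 0.0) ** 0.5

# Hypothetical limit load = yield stress * geometry factor (both random).
limit_load = lambda x: x[0] * x[1]
mean, std = pem_mean_std(limit_load, [300.0, 1.0], [30.0, 0.05])
```

Only 2^n evaluations of the (possibly expensive) structural model are needed, which is the small-sample advantage the abstract refers to; for many variables, reduced-point variants are typically preferred.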
The study deals with stability and dynamic problems in bar structures using a probabilistic approach. Structural design parameters are defined as deterministic values and also as uncorrelated random variables. The criterion of structural failure is expressed by the conditions of not exceeding the admissible load multiplier and of not exceeding the admissible vertical displacement. The Hasofer-Lind index was used as a reliability measure, with the FORM method as the primary research tool. To verify the correctness of the calculations, the Monte Carlo and Importance Sampling methods were used. The sensitivity of the reliability index to the random variables was determined. The limit state function is not an explicit function of the random variables; this dependence was determined using a numerical procedure, e.g. the finite element method. The paper also aims to present the communication between the STAND reliability analysis program and the KRATA and MES3D external FE programs.
A method and measurement setup developed for measuring the noise generated in supercapacitors are presented. The requirements for noise data recording are considered and correlated with the working modes of supercapacitors. Example results of low-frequency noise measurements in commercially available supercapacitors are presented. The feasibility of flicker noise measurements suggests that they can be used to assess the quality of tested supercapacitors.
The resistance parameters of timber structures decrease with time, at a rate that depends on the type of load and the timber class. Strength reduction effects due to long-term loading at high stress ratio levels, referred to as creep-rupture effects, are known for many materials, and timber is highly affected by this loss of strength with duration of load. Characteristic values of load duration and load duration factors are calibrated using probabilistic methods. Three damage accumulation models are considered: the Gerhards model and the Barrett-Foschi and Foschi-Yao models. Time-variant reliability aspects are taken into account using a simple representative limit state with time-variant strength and simulation of whole-lifetime load processes. The parameters of these models are fitted by the Maximum Likelihood Method using data relevant to Polish structural timber. The snow load process parameters have been estimated on the basis of 45 years of Polish snow data from the mountain zones of Zakopane (Tatra), Świeradów (Karkonosze) and Lesko (Bieszczady). The reliability is evaluated using representative short- and long-term limit states, and the load duration factor kmod is obtained from the probabilistic model.
The relationship between internal response-based reliability and conditionality is investigated for Gauss-Markov (GM) models with uncorrelated observations. Models with design matrices of full rank and of incomplete rank are taken into consideration. Formulas based on the Singular Value Decomposition (SVD) of the design matrix are derived which clearly indicate that the investigated concepts are independent of each other. Methods are presented for constructing, for a given design matrix, matrices equivalent with respect to internal response-based reliability as well as matrices equivalent with respect to conditionality. To analyse the conditionality of GM models, which in general are inconsistent systems, a substitute for the condition number commonly used in numerical linear algebra is developed, called a pseudo-condition number. Also on the basis of the SVD, a formula for external reliability is proposed, being the 2-norm of the vector of parameter distortions induced by a minimal detectable error in a particular observation. For systems with equal nonzero singular values of the design matrix, the formula can be expressed in terms of the index of internal response-based reliability and the pseudo-condition number. With these measures appearing in explicit form, the formula shows, although only for the above specific systems, the character of the impact of internal response-based reliability and conditionality of the model upon its external reliability. Proofs of complementary properties concerning the pseudo-condition number and the 2-norm of parameter distortions in systems with minimal constraints are given in the Appendices. Numerical examples are provided to illustrate the theory.
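A pseudo-condition number of this kind can be sketched directly from the SVD; in the minimal illustration below, the tolerance used to decide which singular values count as nonzero is our assumption:

```python
# Pseudo-condition number of a design matrix A: ratio of the largest to the
# smallest NONZERO singular value, so it is defined also for matrices of
# incomplete rank (where the ordinary condition number would be infinite).
import numpy as np

def pseudo_condition_number(A, tol=1e-12):
    s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
    s_nz = s[s > tol * s[0]]                 # drop (numerically) zero ones
    return s_nz[0] / s_nz[-1]

# Rank-deficient example: the third column is the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [1.0, -1.0, 0.0]])
kappa = pseudo_condition_number(A)   # equals sqrt(3) for this matrix
```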
Electric vehicles (EVs) are currently considered one of the best ways to reduce CO2 and other air emissions in the transport sector, as well as noise in cities. They can visibly reduce the dependency of road transport on imported oil. Nevertheless, the electricity demand of a large EV fleet in road transport is not insignificant and has an impact on the power system. The article analyses the potential impact of EVs on the demand, supply, structure and costs of electricity generation, as well as on emissions, as a result of introducing 1 million EVs onto Polish roads by 2025 and tripling this number by 2035. The competitive electricity market model ORCED was used for the calculations. The results of the analysis indicate that, regardless of the charging strategy, EV demand causes only a slight increase in the overall electricity demand in Poland and, consequently, a slight increase in power generating costs. Even a large increase in EVs in road transport will result in a rather moderate demand for additional generation capacity, assuming that power companies have some control over the mode of charging. The introduction of EVs will not reduce CO2 emissions compared to conventional cars in 2025; on the contrary, it will increase them regardless of the charging strategy. In 2035, however, the result depends on the charging scenario, and either an increase or a decrease in emissions is possible. Electric vehicles will increase net SO2 emissions, but they will contribute to a decrease in the net emissions of particulates and NOx.
The aim of the research was to evaluate wastewater management in terms of the stability and efficiency of wastewater treatment, using statistical quality control. For this purpose, the operation and performance of the "Kujawy" Sewage Treatment Plant, one of the most important and largest sewage management facilities in the city of Cracow, were analysed. The assessment was done using x̄ control charts for 59 observations. The analysed research period covered the years 2012 to 2016. Five key pollutant indicators were used to evaluate the performance of the tested facility: BOD5, CODCr, total suspended solids, total nitrogen and total phosphorus. For the majority of them, the analysis of the control charts showed full stability of their removal in the tested facility. The exception was total nitrogen, for which periods of disturbed stability of its removal processes were noted. The analysis of treatment effectiveness showed in each case that the required reduction efficiency of the analysed pollution indicators was achieved in the "Kujawy" Sewage Treatment Plant.
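A control-chart assessment of this kind can be sketched for an individuals (x) chart; the data below are synthetic removal-efficiency values, not the plant's measurements:

```python
# Individuals (x) control chart sketch: centre line at the mean, limits at
# +/- 2.66 * mean moving range (the standard Shewhart constant for n = 1).
# Synthetic, illustrative removal efficiencies in percent.
data = [96.1, 95.8, 96.4, 95.9, 96.2, 96.0, 95.7, 96.3, 95.9, 96.1,
        96.5, 95.6, 96.0, 96.2, 95.8]

n = len(data)
mean = sum(data) / n
mr = [abs(b - a) for a, b in zip(data, data[1:])]   # moving ranges
mr_bar = sum(mr) / len(mr)

ucl = mean + 2.66 * mr_bar      # upper control limit
lcl = mean - 2.66 * mr_bar      # lower control limit
out_of_control = [x for x in data if x > ucl or x < lcl]
```

Points falling outside the limits (here, none) signal periods of disturbed process stability, as observed for total nitrogen in the study.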
Since psychology emerged as an independent field of knowledge, there has been no consensus as to whether it should develop in the idiographic or the nomothetic way. In the course of time, due to a commitment to what was seen as objectivity in science, the nomothetic approach came to dominate psychology, and researchers have mostly used quantitative psychometric methods to establish general rules of human behaviour. In nomothetic research it is therefore essential to be extremely careful when interpreting results, so as not to commit a reasoning mistake such as the ecological fallacy, which may happen when a researcher draws conclusions about the nature of an individual in a group from the average results of the whole group. In the article, we present two methods for longitudinal research designs which address this problem and give more idiographic information about participants: the Reliable Change Index and the Modified Brinley Plot. Finally, we provide an IBM SPSS Statistics syntax automating the whole process of computation for these new features.
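The Reliable Change Index itself is a one-line formula; a sketch in Python (the article provides SPSS syntax; the participant values here are hypothetical):

```python
# Reliable Change Index (Jacobson-Truax form): RCI = (x2 - x1) / S_diff,
# where S_diff = sqrt(2) * SE and SE = s1 * sqrt(1 - r_xx).
# |RCI| > 1.96 marks a change unlikely (p < .05) to be due to measurement error.
import math

def rci(x1, x2, s1, r_xx):
    se = s1 * math.sqrt(1.0 - r_xx)      # standard error of measurement
    s_diff = math.sqrt(2.0) * se         # std of the difference score
    return (x2 - x1) / s_diff

# Hypothetical participant: pre-test 20, post-test 12, baseline SD 5,
# test-retest reliability 0.85.
value = rci(20.0, 12.0, 5.0, 0.85)
reliable = abs(value) > 1.96             # True: a reliable decrease
```

Computed per participant, the RCI preserves exactly the individual-level information that group averages hide.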
The article concerns the security of power supply to final consumers, especially its two constituent elements: generation adequacy and distribution system reliability. Generation adequacy has been defined with the Loss of Load Probability (LOLP), Loss of Load Expectation (LOLE) and Energy Not Supplied (ENS) indices. Conclusions from the generation adequacy forecast prepared by ENTSO-E for Poland, compared with other European countries for the years 2020 and 2025, have been discussed along with the resulting threats. Interruptions in energy supply have been characterised by the supply interruption duration indicator SAIDI. Finally, a reliability and adequacy analysis has been performed for different scenarios of Polish power system operation in order to assess the possibility of using distributed generation as a backup power source. Based on a simulation model created in the DIgSILENT PowerFactory software, the reliability and adequacy calculations have been performed with the probabilistic non-sequential Monte Carlo method, followed by a discussion of the obtained results.
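A non-sequential Monte Carlo adequacy estimate samples system states independently rather than chronologically; a minimal sketch with an invented four-unit system (unit sizes, forced outage rates and load are all assumptions):

```python
# Non-sequential Monte Carlo sketch: in each trial every unit is available
# with probability (1 - FOR); loss of load occurs when the available
# capacity falls below demand. All numbers are illustrative.
import random

random.seed(7)
units = [(500.0, 0.05), (500.0, 0.05), (300.0, 0.08), (200.0, 0.10)]  # (MW, FOR)
demand = 1000.0                        # MW, assumed constant load level

N = 50_000
shortfalls = 0
for _ in range(N):
    cap = sum(mw for mw, forced_out in units if random.random() > forced_out)
    if cap < demand:
        shortfalls += 1

lolp = shortfalls / N                  # loss of load probability
lole = lolp * 8760.0                   # expected hours of lost load per year
```

A realistic study replaces the constant load with a load duration curve and adds network constraints, which is where tools such as PowerFactory come in.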
The work presents the results of studies on the dependence of the effectiveness of chosen robust estimation methods on the internal reliability level of a geodetic network. The studies use computer-simulated observation systems, so it was possible to analyse many variants differing from each other in a planned way. Four methods of robust estimation, differing substantially in the approach to weight modification, were chosen for the studies. For comparison, the effectiveness studies were also conducted for a method of gross error detection very popular in surveying practice, based on LS estimation results: the so-called iterative data snooping. The studies show that there is a relation between the level of network internal reliability and the effectiveness of robust estimation methods. In most cases in which the observation contaminated by a gross error was characterised by a low index of internal reliability, robust estimation led to results essentially far from expectations.
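Iterative data snooping rests on Baarda's w-test of standardized residuals. A sketch for the simplest possible "network", n repeated measurements of one quantity (the numbers are invented):

```python
# One data-snooping step: LS-adjust, standardize the residuals, flag the
# largest |w| exceeding a critical value. For equal-weight repeated
# measurements the LS estimate is the mean and each observation's
# redundancy number is r = 1 - 1/n.
import math

obs = [10.02, 9.98, 10.01, 9.99, 10.00, 10.65]  # last value holds a gross error
sigma = 0.02                                    # assumed a-priori std of one observation

n = len(obs)
x_hat = sum(obs) / n                   # LS estimate
r = 1.0 - 1.0 / n                      # redundancy number (equal weights)
w = [(o - x_hat) / (sigma * math.sqrt(r)) for o in obs]   # standardized residuals

k = 3.29                               # critical value, alpha about 0.001
suspect = max(range(n), key=lambda i: abs(w[i]))
flagged = abs(w[suspect]) > k          # the erroneous observation is identified
```

In full data snooping this step is iterated: the flagged observation is removed or down-weighted and the adjustment is repeated. Low redundancy (small r) is precisely what weakens both this test and the robust methods studied.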
The paper presents a summary of research activities concerning theoretical geodesy performed in Poland in the period 2011–2014. It contains the results of research on new methods of parameter estimation, a study on the robustness properties of M-estimation, control network and deformation analysis, and geodetic time series analysis. The main achievements in geodetic parameter estimation involve a new model of M-estimation with probabilistic models of geodetic observations, a new Shift-Msplit estimation, which allows the estimation of a vector of parameter differences, and the Shift-Msplit(+) estimation, a generalisation of Shift-Msplit estimation to the case where the design matrix A of the functional model does not have full column rank. New algorithms for the conversion between Cartesian and geodetic coordinates, on both the rotational and the triaxial ellipsoid, can be mentioned as highlights of the research of the last four years. The new parameter estimation models developed have been adopted and successfully applied to control network and deformation analysis. New algorithms based on the wavelet, Fourier and Hilbert transforms were applied to find time-frequency characteristics of geodetic and geophysical time series as well as time-frequency relations between them. Statistical properties of these time series are also presented using different statistical tests as well as the 2nd, 3rd and 4th moments about the mean. New forecasting methods are presented which enable the prediction of the considered time series in different frequency bands.
In the article, problems related to human labour and the factors driving the increasing use of industrial robots are discussed. Since human factors affect the stability of production processes, robots are increasingly preferred. The application of robots is characterised by higher performance and reliability compared with human labour. The problem is how to determine the real difference in work efficiency between a human operator and a robot. The aim of the study is to develop a method that allows a clear determination of the productivity growth associated with the replacement of human labour by industrial robots. A further aim of the paper is to show how to model robotised and manually operated workstations in computer simulation software. An analysis of the productivity and reliability of a hydraulic press workstation operated by a human operator or an industrial robot is presented. Simulation models have been developed taking into account the availability and reliability of the machine, the operator and the robot. We apply the OEE (Overall Equipment Effectiveness) indicator to show how availability and reliability parameters influence the performance of the workstation over the long term. A simplified financial analysis is presented, considering the different labour costs in EU countries.
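An OEE comparison of this kind can be sketched as follows; all workstation parameters are invented for illustration:

```python
# OEE = Availability x Performance x Quality, the standard decomposition.
# Numbers below are assumed for a hypothetical press workstation.
def oee(planned_time, downtime, ideal_cycle, pieces, good_pieces):
    run_time = planned_time - downtime
    availability = run_time / planned_time          # uptime share
    performance = (ideal_cycle * pieces) / run_time # speed vs ideal cycle
    quality = good_pieces / pieces                  # first-pass yield
    return availability * performance * quality

# Manual vs robotised shift: 480 min planned, 0.5 min ideal cycle per piece.
oee_manual = oee(480.0, 60.0, 0.5, 700, 672)
oee_robot = oee(480.0, 20.0, 0.5, 880, 871)
```

Run over a simulated year rather than one shift, the same decomposition shows how small differences in availability and reliability compound into the long-term productivity gap between the two workstations.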
The summary of research activities concerning general theory and methodology performed in Poland in the period 2015–2018 is presented as a national report for the 27th IUGG (International Union of Geodesy and Geophysics) General Assembly. It contains the results of research on new or improved methods and variants of robust parameter estimation and their application, especially to control network analysis. Reliability analysis of the observation system and an integrated adjustment approach are also given. The identifiability (ID) index has been introduced as a new measure for the minimal detectable bias (MDB) in the observation system of a network. A new method of covariance function parameter estimation in least squares collocation has been developed. The robustified version of Shift-Msplit estimation, termed Shift-M*split estimation, which enables robust estimation of parameter differences without prior estimation of the parameters themselves, has been introduced. Results on the analysis of geodetic time series, particularly Earth orientation parameter time series, geocentre time series, permanent station coordinates and sea level variation time series, are also provided in this review paper. The full bibliography of related works is provided in the references.
To reliably calibrate suitable partial safety factors, used to specify the global condition describing the structural safety level in the considered design case, the evaluation of an adequate failure probability is usually necessary. In an accidental fire situation, not only the probability of collapse of the load-bearing structure but also another probability, related to the people staying in the building at the moment of fire occurrence, should be assessed. These two values differ qualitatively, but they are coupled because they are determined by similar factors. The first is a conditional probability, conditioned on the fire having already occurred, whereas the second is the probability of failure under a potential fire which can take place in the examined building compartment but whose ignition has not yet occurred. An engineering approach to estimating both probabilities is presented and discussed in detail in the article.
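In the simplest engineering treatment, the coupling between the two probabilities reduces to a product rule; a sketch with purely illustrative numbers:

```python
# The unconditional probability of fire-induced failure is the conditional
# probability of collapse given fire multiplied by the probability that a
# fire occurs at all. Both input values below are assumed for illustration.
p_fire = 1e-2              # probability of fire in the compartment, per reference period
p_fail_given_fire = 1e-3   # conditional probability of collapse given fire

p_fail = p_fail_given_fire * p_fire   # unconditional failure probability
```

This is why a much milder reliability target can be accepted for the conditional (post-ignition) verification than for persistent design situations: the rarity of ignition itself carries part of the safety margin.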
The paper presents a method for the probabilistic optimisation of the load-bearing capacity and reliability of statically indeterminate bar structures, and of the coupling of members in kinematically admissible failure mechanisms (KAFM), which contain minimal critical sets of elements (MCSE). The latter are characterised by the fact that if even a single element is operational, the whole set is operational too. A method of increasing the load-bearing capacity and reliability of KAFM built from bars dimensioned in accordance with the code is presented. The paper also estimates the load-bearing capacity and reliability of KAFM of the optimised structures containing elastic-plastic bars with quasi-brittle connections to the nodes. The necessary increase in connection load-bearing capacity and reliability relative to bar reliability, required to prevent bars from being excluded from the MCSE due to connection fracture, is estimated.
The paper provides a solution to the problem of dimensioning decisive bars on the basis of the conditions for meeting the recommended reliability classes of statically determinate structures composed of n members. A theorem was formulated: if a statically determinate structure composed of n decisive members is to attain a reliability greater than, or equal to, the recommended reliability p = 1 − q, it is necessary and sufficient that the sum of the damage frequencies qᵢ of the decisive members is smaller than the admissible damage frequency q of the structure: ∑qᵢ < q. On the basis of this theorem, the s coefficients governing the recommended increase of the load-bearing capacity of the decisive bars in a statically determinate structure, constructed in order to meet the recommended class of structural reliability, are estimated and presented in tabular form.
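The theorem's condition is easy to check numerically; a sketch with invented damage frequencies, also verifying that the exact series-system failure probability is bounded by the sum, which is what makes the condition work under independence:

```python
# A statically determinate structure is a series system: it fails if any
# decisive member fails. Check the theorem's condition sum(q_i) < q and
# compare with the exact failure probability. All values are illustrative.
q_target = 1e-4                    # assumed admissible damage frequency of the structure
q_members = [2e-5, 3e-5, 1.5e-5]   # assumed member damage frequencies

q_sum = sum(q_members)
meets_class = q_sum < q_target     # the theorem's sufficient condition

# Exact series-system failure probability under independence.
p_fail_exact = 1.0
for q in q_members:
    p_fail_exact *= 1.0 - q
p_fail_exact = 1.0 - p_fail_exact  # always <= q_sum, so the bound is safe
```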
From the theory of reliability it follows that the greater the observational redundancy in a network, the higher its level of internal reliability. However, taking into account the physical nature of the measurement process, one may notice that planned additional observations may increase the number of potential gross errors in a network, without raising the internal reliability to the theoretically expected degree. Hence, it is necessary to set realistic limits on a sufficient number of observations in a network. An attempt to provide principles for finding such limits is undertaken in the present paper. An empirically obtained formula (Adamczewski 2003), called there the law of gross errors, determining the chances that a certain number of gross errors may occur in a network, was taken as a starting point for the analysis. With the aid of an auxiliary formula derived on the basis of the Gaussian law, the Adamczewski formula was modified to become an explicit function of the number of observations in a network. This made it possible to construct the tools necessary for the analysis and, finally, to formulate guidelines for determining the upper bounds for internal reliability indices. Since the Adamczewski formula was obtained for classical networks, the guidelines should be considered an introductory proposal requiring verification with reference to modern measuring techniques.
Low-frequency noise measurements have long been recognized as a valuable tool in the examination of the quality and reliability of metallic interconnections in the microelectronics industry. While characterized by very high sensitivity, low-frequency noise measurements can be extremely time-consuming, especially when tests have to be carried out over an extended temperature range with high temperature resolution, as required by some advanced characterization approaches recently proposed in the literature. To address this issue, we designed a dedicated system for characterizing the low-frequency noise produced by a metallic line versus temperature. The system combines high flexibility and automation with excellent background noise levels. Test temperatures range from ambient up to 300 °C. Measurements can be completely automated, with the temperature changing in pre-programmed steps. A temperature-ramp mode is also available which can be used, with proper caution, to obtain a virtually continuous plot of noise parameters versus temperature.