Operation Sequencing Modeling
Saeed Khalili
Abstract
Incorporating maintenance strategy into models that schedule and allocate jobs to machines makes the proposed models compatible with real production environments and increases their efficiency in optimizing production systems. To this end, a mathematical model for scheduling unrelated parallel machines is developed to minimize the total weighted completion time. Availability constraints are considered and preemption is allowed. Because preventive and emergency maintenance programs are executed, machine downtime is added to job completion times. Since the proposed model is highly complex, two meta-heuristic methods, simulated annealing and a genetic algorithm, are used to solve the problem, and their performance is compared. The results indicate the superiority of simulated annealing over the genetic algorithm for this particular problem.
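The simulated annealing side of the comparison can be illustrated with a minimal, self-contained sketch for minimizing total weighted completion time on unrelated parallel machines. The instance data, neighborhood move, and cooling schedule below are hypothetical illustrations, not the paper's actual model (which also handles preemption and maintenance windows):

```python
import math
import random

def total_weighted_completion(assign, proc, weights, n_machines):
    """Sum of w_j * C_j when each machine processes its assigned jobs in index order."""
    total = 0.0
    for m in range(n_machines):
        t = 0.0
        for j in range(len(assign)):
            if assign[j] == m:
                t += proc[j][m]          # unrelated machines: job j's time depends on m
                total += weights[j] * t
    return total

def simulated_annealing(proc, weights, n_machines, T=10.0, cooling=0.95, iters=2000, seed=0):
    """SA over job-to-machine assignment vectors; the move reassigns one random job."""
    rng = random.Random(seed)
    n = len(proc)
    cur = [rng.randrange(n_machines) for _ in range(n)]
    cur_cost = total_weighted_completion(cur, proc, weights, n_machines)
    best, best_cost = cur[:], cur_cost
    for _ in range(iters):
        cand = cur[:]
        cand[rng.randrange(n)] = rng.randrange(n_machines)
        cost = total_weighted_completion(cand, proc, weights, n_machines)
        # accept improvements always, worsenings with Boltzmann probability
        if cost < cur_cost or rng.random() < math.exp((cur_cost - cost) / T):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
        T = max(T * cooling, 1e-6)
    return best, best_cost
```

A genetic algorithm would instead evolve a population of such assignment vectors with crossover and mutation; the abstract reports that simulated annealing outperformed it on this problem.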
Data Mining and Related Topics
Fatemeh Mirsaeedi; Hamidreza Koosha; Mohammad Ghodoosi
Abstract
Assessing academic performance through educational data mining is one of the most important issues in educational management and a focus of researchers. The purpose of this study is to present an experimental method for selecting an appropriate algorithm for predicting students' academic status in two-class and three-class settings. The two-class database predicts students' admission or rejection in a course, while the three-class database additionally identifies prone and elite students. Using previous articles in educational data mining and expert opinion, factors affecting students' academic performance were identified and a database was compiled based on them. After parameter optimization and implementation of different algorithms, the algorithms' performance scores were calculated using a paired t-test on three indices (accuracy, F-measure, and ROC), and the algorithms were compared with the TOPSIS and VIKOR methods. In the two-class mode, the Support Vector Machine algorithm showed the best performance, with a TOPSIS value of 0.999115 and a VIKOR value of zero. In the multi-class mode, the Logistic Regression algorithm performed better than the other algorithms, with TOPSIS and VIKOR values of 0.9986044 and 0.0009798, respectively. The proposed method can be used as a tool for selecting the best-performing algorithm in educational data mining, since choosing an algorithm that achieves accurate results is highly effective and can inform the counseling process and help prevent students' academic failure.
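The TOPSIS step used to compare the algorithms ranks alternatives by relative closeness to the positive ideal solution. A compact sketch of standard TOPSIS (the decision matrix and weights below are made-up numbers, not the study's accuracy/F-measure/ROC data):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness coefficients for alternatives (rows) over criteria (columns).
    benefit[j] is True when larger values of criterion j are better."""
    X = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # vector-normalize each column, then apply criterion weights
    V = X / np.linalg.norm(X, axis=0) * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness coefficient in [0, 1]

# hypothetical example: criterion 0 is a cost, criterion 1 a benefit
scores = topsis([[250, 16], [200, 16], [300, 32]], [0.5, 0.5], [False, True])
```

VIKOR would instead rank by a compromise of group utility and individual regret; the study applies both and compares the resulting orderings.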
Data Envelopment Analysis
Hamid Reza Yoosefzade; Azam Teimuri; Aghile Heidari
Abstract
Data Envelopment Analysis (DEA) models based on Goal Programming (GDEA) seek to address some drawbacks of classical DEA by increasing the degree of resolution and providing real weights for Decision-Making Units (DMUs). Experimental results indicate that GDEA models do not completely cope with these drawbacks in some of the tested cases. Moreover, when calculating the optimal solution with different methods of evaluating unit efficiency, we are faced with a group of Pareto optimal solutions, which confronts the decision maker with a serious challenge in choosing the most appropriate one. To resolve this, this paper first uses the concepts of fuzzy logic and proposes the F-GDEA approach, based on fuzzy logic, for solving GDEA models, which increases the resolution of the unit-ranking methods. In the second step, using the F-GDEA approach, we propose a new hybridized fuzzy approach, HF-GDEA for short, that takes into account the various ranking results from the different programming models. With this new approach, we combine the rankings obtained from different methods and present a new ranking of the DMUs. In other words, the HF-GDEA approach makes it possible to compare, and thus select, an optimal solution from the set of Pareto optimal solutions. Finally, the proposed approach is applied to two practical examples and their numerical results are presented.
Location Modeling
Meisam Jafari Eskandari; Hamed Nozari; Merdad Mokhtari Saghinsara
Abstract
In this research, a supply chain network is designed that takes social and corruption-related conditions into account. To evaluate the model, a small-dimensional example was first designed and the model was solved with three decision methods (utility function, comprehensive criterion, and goal programming). To compare the objective function values and the efficient solutions obtained from the bi-objective model, we compared efficiency indicators: the averages of the objective functions, the number of efficient solutions, the maximum spread index, the gap index, the distance from the ideal point, and the computational time. The comprehensive-criterion method was more efficient than the other methods on the average of the first objective function, the distance from the ideal point, and the computational time. The goal programming method proved more effective on the average of the second objective function, the number of efficient solutions, the maximum spread index, and the efficiency gap index. The utility function method was more efficient at solving the problem in less time. Finally, to compare the solution methods and choose the most efficient one, TOPSIS was applied, and the comprehensive-criterion method emerged as the most efficient among the existing methods.
Data Envelopment Analysis
Reza Maddahi; Hamidreza Yazdani
Abstract
Data Envelopment Analysis (DEA) is a technique for evaluating homogeneous decision-making units. In this method, the efficiency score of each Decision-Making Unit (DMU) is obtained by comparing its performance with that of the other units, and this score can be used as a criterion for ranking. In many cases, however, a significant number of units are efficient, so the efficiency scores of classic DEA models cannot serve as a good criterion for accurately ranking DMUs. In this study, a method for ranking DMUs based on their efficiency over several time periods is presented, in which three types of production possibility sets are introduced. In the first type, an independent production possibility set is defined for each time period; in the second, a combination production possibility set is used across all time periods; and in the third, a union production possibility set is created that considers all time periods together. Corresponding to each type, one efficiency value is obtained for each DMU. The three efficiency values resulting from the three methods are then combined using the Shannon entropy method to define a general performance criterion for each unit, which is ultimately used as the main indicator for ranking the units.
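The Shannon entropy combination weights each of the three efficiency series by how much it actually discriminates among the DMUs: a series that is nearly constant carries little information and receives little weight. A sketch under the usual entropy-weighting convention (the scores below are hypothetical, not DEA results):

```python
import numpy as np

def entropy_weights(scores):
    """scores: rows = DMUs, columns = efficiency series (one per PPS type).
    Returns Shannon-entropy weights for the columns."""
    X = np.asarray(scores, dtype=float)
    P = X / X.sum(axis=0)                       # column-wise shares
    k = 1.0 / np.log(X.shape[0])
    # convention: 0 * log 0 = 0
    plogp = P * np.log(np.where(P > 0, P, 1.0))
    E = -k * plogp.sum(axis=0)                  # entropy of each column, in [0, 1]
    d = 1.0 - E                                 # degree of diversification
    return d / d.sum()

def combined_efficiency(scores):
    """Weighted combination of the efficiency series for each DMU."""
    return np.asarray(scores, dtype=float) @ entropy_weights(scores)
```

A column in which every DMU scores identically gets (numerically) zero weight, which matches the intuition that it cannot help rank the units.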
Fuzzy Optimization
Morteza Goli; Hadi Nasseri; Mehrdad Ghaznavi
Abstract
In this paper, we deal with a linear programming problem with non-symmetric trapezoidal intuitionistic fuzzy numbers. In recent years, many authors have studied symmetric trapezoidal intuitionistic fuzzy numbers: after defining a ranking function and arithmetic operations on these numbers, they solved the intuitionistic fuzzy linear programming problem. The main shortcoming of their method, however, is that it is only applicable to symmetric trapezoidal intuitionistic fuzzy numbers. To overcome this limitation, we present a new arithmetic and a new ordering for non-symmetric trapezoidal intuitionistic fuzzy numbers. We then present the general model of an intuitionistic fuzzy linear programming problem and prove a number of important theorems for solving it. Finally, we present the intuitionistic fuzzy simplex algorithm and, through two examples, show the application of this new approach and its superiority over the purely fuzzy approach.
Strategic Planning
Behrouz Khoshnamak; Suleiman Iranzadeh; Assadaleh Khadivi; Houshang Taghizadeh
Abstract
A mixed (qualitative and quantitative) approach was used to conduct this research. In the qualitative part, semi-structured interviews were conducted with 11 purposefully selected people until theoretical saturation was reached, and the proposed conceptual model of the research was formed based on the six stages of the thematic analysis approach. The resulting model comprises 44 concepts organized into 11 sub-categories (personality, interaction, environmental awareness, marketing, environmental communications, human capital management, proper management of the general office/branch, general knowledge, specialized knowledge, structure, and content) and 5 main categories (individual competence, environmental competence, managerial competence, knowledge competence, and organizational competence). In the quantitative part, the final model derived from the qualitative approach was provided to the research experts as a Delphi questionnaire to determine the most important competency indicators for the organizational positions under study. After several rounds of the questionnaire and the theoretical consensus of the experts, the index of reforming and managing work processes and methods was identified for the organizational positions of provincial general manager, provincial administrative and financial deputy, and head of the specialized department.
Multi-Attribute Decision Making
Rouhollah Kiani Ghaleh No
Abstract
In the last decade, multi-criteria decision-making methods have been used extensively to evaluate multiple units with similar task descriptions. One of the most widely used, grounded in mathematical principles, is the TOPSIS method. The ranking mechanism in TOPSIS is based on measuring each alternative's distance from the positive and negative ideal solutions, so the presence of outlier data can have a negative impact on the calculations. This study modifies the TOPSIS method so that it can control outlier data, and introduces the resulting modified TOPSIS algorithm. To validate the proposed algorithm, the performance of 1951 branches of the Agri-Bank was evaluated in the case study section, and the results were compared with the standard TOPSIS method. Applying the modified TOPSIS method with different coefficients for controlling data scatter, and examining the correlation coefficients, shows that the modified method controls outlier data well.
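The abstract does not spell out the exact modification, so as one illustration of how outlier data can be controlled before TOPSIS normalization, the sketch below clips each criterion to an interquartile band governed by a scatter-control coefficient. This is an assumed stand-in, not the authors' algorithm:

```python
import numpy as np

def clip_outliers(X, c=1.5):
    """Clip each column of X to [Q1 - c*IQR, Q3 + c*IQR] before normalization.
    c is the scatter-control coefficient; a smaller c clips more aggressively."""
    X = np.asarray(X, dtype=float)
    q1 = np.percentile(X, 25, axis=0)
    q3 = np.percentile(X, 75, axis=0)
    iqr = q3 - q1
    return np.clip(X, q1 - c * iqr, q3 + c * iqr)
```

Running standard TOPSIS on the clipped matrix keeps an extreme branch from stretching the ideal and anti-ideal points and distorting everyone else's closeness coefficients, which mirrors the effect the study reports.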
Data Envelopment Analysis
Ehsan Momeni; Farhad Hosseinzadeh Lotfi; Reza Farzipoor Saen
Abstract
Nowadays, sustainable development is the most important issue in the economic development of countries. To achieve it, countries must pay special attention to environmental aspects, which focus on ecosystem stability and the maintenance of ecological functions. To make countries more sustainable, Greenhouse Gas (GHG) emissions should be reduced and controlled. The cap and trade approach is one of the most effective approaches to controlling GHG emissions: the total amount of emissions is decreased by reallocating emission permits to countries. The objective of this paper is to propose a centralized Data Envelopment Analysis (DEA) model that reallocates emission permits in the cap and trade system according to countries' efficiencies. Our model evaluates the efficiency of countries in the presence of discretionary and nondiscretionary inputs, and fuzzy sets capture the uncertainties in the parameters. The paper also determines the amount of emitted gases that can be reduced without reducing other outputs. A case study demonstrates the applicability of the proposed model, and a sensitivity analysis investigates the impact of parameter variations on the results.
Multi-Attribute Decision Making
Naeme Zarrinpoor; Mohsen Amiri; Mohammad Hadi Nematolahi
Abstract
The green construction industry has emerged with urbanization, global population growth, and the growing demand for more sustainable and eco-friendly structures, and it has expanded quickly thanks to benefits such as energy and resource savings, lower greenhouse gas emissions, and better resident health. Urban development and planning based on green buildings is more complex than planning based on regular buildings, so accurate and comprehensive planning to identify the risk factors affecting the success of green building projects is necessary. This study identifies and evaluates the risks of green buildings based on a real case study in the city of Shiraz. Based on the opinions of an expert team (consulting engineers, designers, executors, and contractors) and a literature review, 17 criteria are identified as the most important factors and classified into 5 groups: policies and standards, economic factors, environmental factors, management factors, and technical and quality factors. The relationships between the risk criteria and sub-criteria are studied with the DEMATEL procedure, and the risk criteria are ranked with the Analytic Network Process (ANP). The results show that government policies and complicated approval procedures, project delays, the lack of specific insurance for green buildings, and the lack of accurate estimation of investment returns are the most important risk factors of green buildings, and urban designers must focus on them to increase the success of the newly emerging green construction industry.
Stochastic/Probabilistic/Fuzzy/Dynamic Modeling
Hossein Jafari; Mohammad Javad Ebadi
Abstract
The Cramer-Rao lower bound is obtained using integration by parts and the Cauchy-Schwarz inequality; the integration-by-parts formulas of Malliavin calculus play a role in this study. The point estimation problem is crucial and has a wide range of applications. When we deal with concepts such as random variables, the parameters of interest and the estimates may be observed imprecisely, so the theory of fuzzy sets is important for formulating such situations. Using fuzzy set theory, we define fuzzy-valued random variables and fuzzy stochastic processes. We use the Malliavin derivative and the Skorohod integral to study the asymptotic properties of the statistical model for fuzzy random variables. We show how to use the conditional expectations of certain expressions to derive Cramer-Rao lower bounds for fuzzy-valued random variables that do not require the explicit expression of the likelihood function. As an example, we consider a fuzzy random sample of size n induced by independent standard normally distributed random variables with a fuzzy parameter.
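For context, the classical (crisp) Cramer-Rao inequality that the paper extends states that, for an unbiased estimator T of a parameter θ with likelihood f(x; θ),

```latex
\operatorname{Var}_{\theta}\bigl(T(X)\bigr) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial\theta}\,\log f(X;\theta)\right)^{2}\right],
```

where I(θ) is the Fisher information. The paper's Malliavin-calculus route derives analogous bounds for fuzzy-valued random variables without requiring the likelihood f explicitly.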
Scheduling Modeling
Habibeh Nazif; Khadijeh Ghaziani
Abstract
Timetabling is the problem of assigning particular resources, subject to constraints, to a limited number of time slots and spaces so as to satisfy a set of goals; it arises in a variety of problems. Among these, the University Examination Timetabling Problem (UETP) is of particular importance in education. The UETP is defined as the assignment of a certain set of exams to a fixed number of time slots and rooms so that all hard constraints are met and soft constraints are optimized as far as possible. This research presents a modified approach to optimizing the uncapacitated UETP, in which a proposed Genetic Algorithm (GA) is enhanced with local search operators. These operators alter the timetable by shifting or swapping scheduled exams, greatly improving the algorithm's ability to search. The efficiency of the proposed approach is compared with other techniques from the literature using Carter's benchmark. The computational results show that the approach is effective and competitive in improving solutions and produces better solutions on most of the datasets than the other algorithms.
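A shift-style local search operator of the kind described, which moves one exam to the slot that minimizes hard-constraint violations, might look like the following sketch. The clash representation and move rule are illustrative assumptions, not the paper's exact operators:

```python
import random

def conflicts(timetable, clash):
    """Count pairs of clashing exams placed in the same time slot.
    clash is a set of frozensets {e1, e2} of exams sharing students."""
    return sum(1 for pair in clash
               if len({timetable[e] for e in pair}) == 1)

def shift_move(timetable, clash, n_slots, rng):
    """Pick a random exam and move it to the slot that minimizes conflicts."""
    exam = rng.choice(sorted(timetable))
    best_slot, best = timetable[exam], conflicts(timetable, clash)
    for s in range(n_slots):
        trial = dict(timetable)
        trial[exam] = s
        c = conflicts(trial, clash)
        if c < best:
            best_slot, best = s, c
    timetable[exam] = best_slot   # keep the current slot if no slot improves
    return timetable
```

In the hybrid scheme the abstract describes, such moves would be applied to GA offspring after crossover and mutation, repairing clashes that the genetic operators introduce.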
Fuzzy Optimization
Morteza Shafiee; Hilda Saleh; Atefeh Kaveh
Abstract
Assessing the reliability and availability of a production system reduces the likelihood of sudden and costly stops, which are very risky. To this end, this paper provides a new way to determine the reliability and availability of a production system that can be used for a variety of failure-prone components such as materials, supplies, personnel, and machinery. Using a fuzzy Bayesian approach, uncertain events that actually occur in a production system are processed, and the proposed model is used to assess the condition of the Pegah Fars milk factory: the failure and repair rates and the reliability of the components and the system were calculated with the Bayesian method, and because the available information is uncertain, the reliability parameters were made fuzzy. The availability of the components, and then of the whole system, was calculated using the formula provided by Martz and Waller together with the Bayesian method, and the availability parameters were likewise converted into fuzzy values. Finally, the information obtained about the reliability and availability of the system and its components was analyzed; the results show that the improved approach provides a more accurate estimate of reliability and availability.
Multi-Attribute Decision Making
Hojat Sharifpour; Hasanali Aghajani; Abdolhamid Safaei Ghadikolaei
Abstract
Today, organizations need to develop their production systems to achieve and maintain a competitive advantage in the market. Achieving this requires a completely new and innovative approach to managing and producing within an organization, one that dramatically increases productivity and helps create effective and efficient supply chains. Now that the industrial revolution has reached its fourth generation, organizations can use its new technologies to achieve and maintain competitive advantage and to increase productivity. The present study examines the interactive relationships among fourth-generation industry technologies in selected food industries. First, 20 fourth-generation industry technologies were extracted from the literature; in the second stage, 13 of them were identified, with a fuzzy Delphi approach, as relevant to the food industry. R-DEMATEL was then used to investigate the interactive relationships among these technologies.
Decisions in new businesses
Elham Fazelli Veisari; Mohamad Javad Taghipourian; Reza Tavoli; Ghydar Ghanbarzade
Abstract
The purpose of this study is to identify the components of viral marketing and develop a model that provides rules for optimizing it in businesses. It is applied research, and in terms of method it is mixed (quantitative and qualitative). The statistical population in the qualitative part includes 15 people across the three generations X, Y, and Z (the millennial marketing generation), and in the quantitative part 460 online buyers. In the qualitative part, data were collected with the projection technique and in-depth interviews. The interviews were analyzed and summarized using MAXQDA software, through which six components were identified; then, in the quantitative part, 12 experts were consulted to determine the CVR index, and exploratory factor analysis was performed with SPSS software. Because selecting the most effective new components of viral marketing can have a huge impact on the accuracy of a viral marketing model for online businesses, a genetic metaheuristic algorithm, implemented in WEKA and RAPIDMINER, was used to identify the most effective components. Finally, the rules of viral marketing optimization were identified using the decision tree method. The findings of the qualitative section indicate that online persuasion, online trust, online support, online services, online attractiveness, and online risk-taking are components of viral marketing. In the quantitative section, the genetic algorithm showed that the online risk component could not serve as an effective component for modeling and extracting viral marketing rules, so it was removed from the six components.
Mathematical Optimization Models
Hassan Rashidi
Abstract
Many of the world's top universities have already decided to hold the next semester with e-education. In our country, forecasts show a red situation in some areas in terms of the prevalence of coronavirus, and a number of university students live in these areas; therefore, in planning the next semester, more attention should be paid to e-education. It is recommended that the next semester be implemented in two parts: e-education for 10 weeks and face-to-face training for 3 weeks. For face-to-face training, given the needs of the educational and dormitory space, students are divided into two sub-categories (A) and (B) so that health protocols can be implemented in universities and dormitories. In this paper, to determine the number of male and female students accommodated in dormitories for both sub-categories (A) and (B) during face-to-face training, a mathematical optimization model is proposed in the form of nonlinear programming with integer decision variables. In the objective function, students are distributed across the educational space and dormitories so as to achieve the maximum possible dispersion (minimum difference), in order to prevent the spread of the corona disease. The model has been implemented for the allocation of student dormitories at Allameh Tabatabai University, and its use can bring positive results for decision makers.
Non-linear Optimization
Zohreh Akbari
Abstract
In this paper, we present a new trust region method for unconstrained optimization problems with locally Lipschitz continuous, nonconvex functions. In this method, the current objective function value in the ratio test is replaced with the maximum of some objective function values from previous iterations. The new method has nonmonotone properties and avoids falling into narrow valleys. Proving global convergence requires only two conditions: (1) the solution of the trust region subproblem yields a sufficient reduction in the approximate model, and (2) the approximate Hessian matrix is bounded. The convergence properties of the method are then investigated. Finally, the presented method is implemented on some nonconvex problems in the MATLAB environment, and the numerical results are compared with the nonsmooth trust region method.
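The nonmonotone ratio test can be sketched as follows: the numerator compares the trial value against the maximum over a sliding window of recent objective values rather than against the current iterate alone, so a step that slightly worsens the last iterate can still be accepted. The window size and acceptance threshold below are illustrative, not the paper's parameters:

```python
def nonmonotone_ratio(f_history, f_trial, model_decrease):
    """rho = (max of recent f values - f(trial)) / predicted model decrease.
    f_history: recent objective values, most recent last; model_decrease > 0."""
    return (max(f_history) - f_trial) / model_decrease

def accept_step(f_history, f_trial, model_decrease, eta=0.1, window=5):
    """Accept the trial point when the nonmonotone ratio clears the threshold,
    then append it to the history and keep only the last `window` values."""
    rho = nonmonotone_ratio(f_history, f_trial, model_decrease)
    if rho >= eta:
        f_history.append(f_trial)
        del f_history[:-window]
        return True
    return False
```

A standard (monotone) ratio test uses only `f_history[-1]` in the numerator; replacing it with the window maximum is what lets the iterates climb out of narrow valleys.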
Data Envelopment Analysis
Hossein Azizi
Abstract
Research has revealed that Data Envelopment Analysis (DEA) is an excellent method of data-based performance analysis for comparing decision-making units with multiple inputs and outputs. Selecting inputs and outputs (performance measures) in DEA is a delicate task. In principle, including a large number of inputs and outputs is an advantage. However, it may mean a great deal of additional data being included, and some decision-making units may then be designated as efficient simply because of their high performance on a number of redundant and useless variables. Moreover, in some situations a performance measure can play both an input and an output role; such measures are called flexible measures or dual-role factors. Although models have been developed for working with dual-role factors, this paper proposes performance appraisal from both an optimistic and a pessimistic perspective for selecting a third-party reverse logistics provider in the presence of multiple dual-role factors. A numerical example illustrates the application of the proposed approach.
Fuzzy Optimization
Malihe Niksirat
Abstract
Purpose: During the coronavirus epidemic, and in order to comply with the rules of social distancing, public transport operators have to operate with less capacity. Because demand may exceed capacity in different areas at different times of the day, drivers are forced to skip certain stations to avoid overcrowding. Methodology: This paper develops decision support tools to prevent vehicle congestion. To reflect real conditions, two types of uncertainty, fuzzy and scenario-based, are considered. A dynamic nonlinear integer programming model is introduced to obtain the optimal service pattern for vehicles that are ready to be dispatched. To overcome the combined uncertainty of the problem, possibility theory is used in a new fuzzy stochastic programming approach that has significant advantages. Findings: The model strikes a balance between observing social distancing, by reducing vehicle capacity, and reducing the waiting time of passengers who lose service. Numerical examples illustrate the proposed concepts and model and compare the results. Originality/Value: The proposed decision support model can suggest service patterns for different service lines and can assist public transport operators in evaluating the advantages and disadvantages of implementing epidemic-based service patterns, given operational constraints and traveler demand levels.
Multi-Attribute Decision Making
Abbas Jahangiri
Abstract
Service organizations such as water and wastewater companies need feedback and performance evaluation to deliver better services. The purpose of this study was to analyze the trend of water supply to Iranian cities and villages, and of wastewater disposal from them, during the years 2012 to 2018. In this descriptive cross-sectional study, the required data were taken from the statistical yearbook of the national water and wastewater company. The performance of four areas, urban water supply, rural water supply, urban wastewater disposal, and rural wastewater disposal, over 7 consecutive years was evaluated using one of the newest multiple attribute decision making methods, Measurement of Alternatives and Ranking according to COmpromise Solution (MARCOS), treating each year as an alternative, together with time series analysis and linear fitting in Excel and SPSS. The results showed, with a 99% confidence level, that the performance of all four areas has significantly increased.
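The MARCOS procedure itself is fully specified in the literature: the decision matrix is extended with an ideal and an anti-ideal solution, normalized against the ideal, weighted, and each alternative's final utility is computed relative to both reference points. A minimal sketch (function name and toy data are ours, not from the study):

```python
import numpy as np

def marcos(X, weights, benefit):
    """MARCOS ranking. X: (m alternatives, n criteria);
    benefit[j] is True if criterion j is to be maximized."""
    X = np.asarray(X, float)
    w = np.asarray(weights, float)
    benefit = np.asarray(benefit, bool)
    ai = np.where(benefit, X.max(axis=0), X.min(axis=0))    # ideal solution
    aai = np.where(benefit, X.min(axis=0), X.max(axis=0))   # anti-ideal
    ext = np.vstack([X, ai, aai])                           # extended matrix
    N = np.where(benefit, ext / ai, ai / ext)               # normalization
    S = (N * w).sum(axis=1)                                 # utility sums
    K_plus = S[:-2] / S[-2]                                 # vs. ideal
    K_minus = S[:-2] / S[-1]                                # vs. anti-ideal
    f_plus = K_minus / (K_plus + K_minus)
    f_minus = K_plus / (K_plus + K_minus)
    # final utility function; higher = better
    return (K_plus + K_minus) / (
        1 + (1 - f_plus) / f_plus + (1 - f_minus) / f_minus)
```

An alternative that is best on every criterion necessarily receives the highest utility, which is a quick sanity check on any implementation.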
Fuzzy Optimization
Gohar Shakouri; Seyed Hadi Nassery; Mohammad Mahdi Paydar
Abstract
Purpose: The transportation problem, as one of the most important and practical models in linear programming, has always been of interest to researchers. Owing to the lack of accurate information, variable economic conditions, uncontrollable factors, and especially the variable availability of resources, adapting the model to real conditions involves two kinds of uncertainty: flexibility in the constraints and fuzziness in the parameters. Hence, one way to express these conditions is to use flexible fuzzy numbers, which make the model more adaptable to reality. Methodology: In this research, after reviewing the literature, the transportation problem is modeled with a flexible-interval fuzzy supply constraint. A flexible fuzzy solution approach to the proposed model is then studied. Findings: Analysis of a numerical example indicates that the parametric linear programming approach offers a reliable design, so that the decision maker can obtain a better selection of resources with the highest degree of satisfaction. Originality/Value: This research develops a parametric approach with flexible relations; based on the results, the solution attains the highest degree of satisfaction of the constraints.
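The parametric idea can be illustrated with an ordinary transportation LP in which each supply can stretch by a tolerance as the satisfaction level α decreases; sweeping α traces the trade-off between constraint satisfaction and cost. This sketch assumes crisp costs and a simple linear membership for the supplies, which is only one possible reading of the flexible-interval constraint, not the paper's exact model:

```python
import numpy as np
from scipy.optimize import linprog

def parametric_transport(c, supply, tol, demand, alphas):
    """Transportation LP where supply i can stretch to
    supply[i] + (1 - alpha) * tol[i]; alpha is the satisfaction level."""
    m, n = c.shape
    results = []
    for a in alphas:
        A_ub = np.zeros((m, m * n))
        b_ub = supply + (1 - a) * tol
        for i in range(m):
            A_ub[i, i * n:(i + 1) * n] = 1.0   # row sums <= relaxed supply
        A_eq = np.zeros((n, m * n))
        for j in range(n):
            A_eq[j, j::n] = 1.0                # column sums == demand
        res = linprog(c.ravel(), A_ub=A_ub, b_ub=b_ub,
                      A_eq=A_eq, b_eq=demand, bounds=(0, None))
        results.append((a, res.fun if res.success else None))
    return results
```

At α = 1 the original supplies bind and cost is highest; relaxing toward α = 0 frees the cheaper routes, exactly the trade-off the decision maker inspects.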
Location Modeling
Sepideh Taghikhani; Fahimeh Baroughi; Behrooz Alizadeh
Abstract
In this paper, the -product and t-state uncapacitated facility location problem is investigated. To be more precise, it is assumed that each customer can request different products in a t-state network. First, a mathematical formulation of the -product and t-state uncapacitated facility location problem with deterministic costs is proposed, and it is shown that this problem is NP-hard. Since in most real-world problems the input data are ambiguous and uncertain, we then study the -product and t-state uncapacitated facility location problem in which the facility set-up costs and customer service costs are fuzzy random variables. Using three criteria, probability-possibility, probability-necessity, and probability-credibility, the problem is formulated as a quadratic programming problem. Finally, a practical example is given to illustrate the efficiency of the proposed approaches.
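For intuition on the deterministic core of the problem, the sketch below solves a tiny uncapacitated facility location instance by exhaustive search. Since the problem is NP-hard, this only works for very small instances, and it ignores the multi-product, multi-state, and fuzzy-random aspects of the paper; names and data are illustrative:

```python
from itertools import combinations

def uflp_brute_force(fixed, cost):
    """Uncapacitated facility location by exhaustive search.
    fixed[f]: cost of opening facility f;
    cost[f][c]: cost of serving customer c from facility f."""
    n_fac, n_cust = len(fixed), len(cost[0])
    best = (float("inf"), None)
    for k in range(1, n_fac + 1):
        for opened in combinations(range(n_fac), k):
            total = sum(fixed[f] for f in opened)
            # each customer is served by its cheapest open facility
            total += sum(min(cost[f][c] for f in opened)
                         for c in range(n_cust))
            if total < best[0]:
                best = (total, opened)
    return best
```

Even this toy shows the trade-off the fuzzy-random model must weigh: a facility with a higher opening cost can still win if its service costs are low enough.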
Linear Optimization
Younes Nozarpour; Sayyed Mohammad Reza Davoodi; Mahdi Fadaee
Abstract
Purpose: A multi-period portfolio can be reviewed and modified at regular intervals after it is formed. The philosophy behind multi-period stock portfolio models is that investors often have a multi-period view of future asset changes, derived from technical, fundamental, or statistical models. Conventional multi-period portfolio models assume that the forecast and revision horizons are the same for all assets. However, one asset may be predicted over a one-month horizon and another over a two-month horizon, and each may be revised on its own schedule. The purpose of this study is to present a multi-period stock portfolio model in which assets have different revision horizons, or in which an asset cannot be traded for the first few periods and only then enters the revision cycle. Methodology: In this model, uncertain variables defined on an uncertainty space describe the returns. The objective function maximizes the final wealth of the portfolio; to limit risk, a constraint controls, at a given confidence level, the chance that final wealth falls below a threshold. To find the optimal solution, the model is converted into a linear program by a change of variables. Findings: After explaining the modeling, the model is implemented in a numerical example on two portfolios with 6 and 10 stocks and 4 monthly time steps on the Tehran Stock Exchange. Originality/Value: The present study extends the uncertain multi-period portfolio to one with different time horizons and obtains the optimal solution through linear programming. Transaction costs are also considered, to be more in line with real conditions.
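The mechanics of different revision horizons can be illustrated with a deterministic simulation: an asset whose lock period has not yet elapsed keeps its current weight, while tradable assets are rebalanced to target weights at a proportional transaction cost. This is only a sketch of the bookkeeping, not the paper's uncertainty-theoretic optimization model; all names and data are illustrative:

```python
def terminal_wealth(w0, weights, returns, lock, tc):
    """Simulate multi-period portfolio wealth.
    weights[t][i]: target fraction of wealth in asset i at period t
    returns[t][i]: return of asset i in period t
    lock[i]: asset i may not be traded before period lock[i]
    tc: proportional transaction cost per unit of wealth traded."""
    n = len(lock)
    w = w0
    held = [0.0] * n                      # current portfolio fractions
    for t, (tgt, r) in enumerate(zip(weights, returns)):
        for i in range(n):
            if t >= lock[i]:              # asset has entered its revision cycle
                w -= tc * abs(tgt[i] - held[i]) * w
                held[i] = tgt[i]
        w *= 1.0 + sum(held[i] * r[i] for i in range(n))
    return w
```

In the optimization model the weights are decision variables; here they are given, which is enough to see how a locked asset simply rides its old weight until its horizon arrives.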
Forecasting Models/ Time Series
Sepideh Etemadi; Mehdi Khashei
Abstract
Purpose: The purpose of this paper is to present a new methodology for statistical modeling which, unlike commonly developed models and algorithms, maximizes the reliability of the results instead of their accuracy. Accordingly, a new class of statistical modeling approaches is developed by replacing conventional processes with the proposed process. Methodology: The multiple linear regression method is selected to implement the proposed methodology. To evaluate the proposed regression model comprehensively, 10 standard datasets from the statistical modeling literature are considered. Findings: Overall, the results show that in 65% of the studied datasets, the proposed model generalizes better than ordinary multiple linear regression. On average, the proposed regression model improves modeling accuracy by 5.571% in mean absolute error and 6.466% in mean squared error compared to its classic version. These results clearly show the significant effect of the reliability of the results on generalizability, which is essentially ignored in usual statistical modeling processes. Originality/Value: Statistical modeling is one of the most important tools for simulating real-world systems and datasets, and is often used for decision making in a wide range of applications. Several approaches with different features have been developed in the literature to cover real-world problems with the desired accuracy. However, such methods follow a similar concept in the modeling process: all conventional statistical modeling approaches rest on the assumption that maximum accuracy on unseen, inaccessible data will be obtained from models that minimize error on the training data.
Although this is a logical and standard procedure in traditional statistical modeling, it is not the only way to achieve maximum generalizability. In other words, the generalizability of a model depends simultaneously on its accuracy and on the reliability of its results. In this paper, a new methodology for statistical modeling is presented which, unlike commonly developed models and algorithms, maximizes the reliability of the results instead of their accuracy.
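The accuracy-versus-generalizability tension can be illustrated with ordinary least squares against a regularized variant: ridge regression deliberately sacrifices training accuracy for coefficient stability, which often improves error on unseen data. Ridge is used here only as a familiar stand-in; it is not the reliability-maximizing process proposed in the paper, and all names are illustrative:

```python
import numpy as np

def fit_ols(X, y):
    """Least-squares coefficients (intercept column appended last)."""
    A = np.c_[X, np.ones(len(X))]
    return np.linalg.lstsq(A, y, rcond=None)[0]

def fit_ridge(X, y, lam):
    """Ridge regression: trades training accuracy for coefficient stability."""
    A = np.c_[X, np.ones(len(X))]
    I = np.eye(A.shape[1])
    I[-1, -1] = 0.0                     # do not penalize the intercept
    return np.linalg.solve(A.T @ A + lam * I, A.T @ y)

def mae(X, y, beta):
    """Mean absolute error of a fitted linear model."""
    A = np.c_[X, np.ones(len(X))]
    return float(np.mean(np.abs(A @ beta - y)))
```

Comparing `mae` on held-out data for the two fits shows how a model that is slightly worse on training data can generalize better, which is the phenomenon the proposed reliability-based methodology targets directly.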
Supply chain management analysis/modeling
Mohammad Hossein Darvish Motevali; Majid Motamedi
Abstract
Data envelopment analysis is the most widely used mathematical modeling approach for evaluating the efficiency of decision-making units. Classical and simple network DEA models cannot calculate the efficiency of multi-stage, sequential supply networks. These networks comprise several successive structures and components, such as the 5-level supply networks used in many strategic industries. What distinguishes this structure is the repetition of a multi-component supply network over time; accordingly, the relationship between the efficiency of a single period and the total efficiency over the whole horizon is examined. The purpose of this paper is to develop a non-radial SBM model and present a dynamic data envelopment analysis model for evaluating the performance of a sustainable supply network as a wide, multi-level network, so that efficiency can be calculated at the five levels of the supply network as well as over consecutive periods. The proposed model has been validated on a 5-level supply network in the cement industry. The results showed that, compared to classic and static network models, the new model performs a logical and realistic evaluation.
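For reference, the standard non-radial SBM (slacks-based measure) model that such work builds on evaluates a DMU $o$ with $m$ inputs $x_{io}$ and $s$ outputs $y_{ro}$ as follows; the dynamic, five-level extension described in the paper adds linking variables between levels and between consecutive periods, which are not shown here:

```latex
\rho^{*}=\min_{\lambda,\; s^{-},\; s^{+}}
\frac{1-\dfrac{1}{m}\sum_{i=1}^{m}\dfrac{s_i^{-}}{x_{io}}}
     {1+\dfrac{1}{s}\sum_{r=1}^{s}\dfrac{s_r^{+}}{y_{ro}}}
\qquad\text{s.t.}\qquad
x_{io}=\sum_{j=1}^{n}\lambda_j x_{ij}+s_i^{-},\quad
y_{ro}=\sum_{j=1}^{n}\lambda_j y_{rj}-s_r^{+},\quad
\lambda_j,\; s_i^{-},\; s_r^{+}\ge 0.
```

The fractional objective is converted to a linear program by the Charnes-Cooper transformation: multiply all variables by a scalar $t>0$ chosen so that the denominator equals one, yielding an ordinary LP in the scaled variables.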