Journal of the operational research society (JORS) / Wilson, John . Vol. 61 N° 3
Issue date: March 2010. Published on: 07/09/2011
Contents
An overview and framework for PD backtesting and benchmarking [printed text] / G. Castermans ; D. Martens ; Van Gestel, T. . - 2011 . - pp. 359–373
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Quantitative validation; Basel II; Credit scoring; Traffic light
Dewey class: 001.424. ISSN: 0160-5682
Abstract: In order to manage model risk, financial institutions need to set up validation processes so as to monitor the quality of the models on an ongoing basis. Validation can be considered from both a quantitative and qualitative point of view. Backtesting and benchmarking are key quantitative validation tools, and the focus of this paper. In backtesting, the predicted risk measurements (PD, LGD, EAD) will be contrasted with observed measurements using a workbench of available test statistics to evaluate the calibration, discrimination and stability of the model. A timely detection of reduced performance is crucial since it directly impacts profitability and risk management strategies. The aim of benchmarking is to compare internal risk measurements with external risk measurements so as to better gauge the quality of the internal rating system. This paper focuses on the quantitative PD validation process within a Basel II context. We set forth a traffic light indicator approach that employs all relevant statistical tests to quantitatively validate the PD model in use, and document this approach with a real-life case study. The methodology and tests set forth summarise the authors’ statistical expertise and experience of business practices observed world-wide.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200969a.html
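As a rough illustration of the traffic-light idea described in the abstract, the sketch below applies a one-sided binomial test to each rating grade and maps the p-value to green, yellow or red. The thresholds, portfolio figures and function names are assumptions for illustration only, not the authors' framework.

```python
# Hedged sketch of a traffic-light PD backtest: one-sided binomial test per grade.
from math import comb

def binomial_pvalue(n, k, p):
    """One-sided p-value P(X >= k) for X ~ Binomial(n, p), via the lower tail."""
    lower = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))
    return 1.0 - lower

def traffic_light(n_obligors, n_defaults, predicted_pd,
                  yellow_pvalue=0.05, red_pvalue=0.001):
    """Green if observed defaults are consistent with the predicted PD; yellow/red
    as the evidence that the PD is underestimated becomes stronger."""
    p_value = binomial_pvalue(n_obligors, n_defaults, predicted_pd)
    if p_value < red_pvalue:
        return "red"
    if p_value < yellow_pvalue:
        return "yellow"
    return "green"

# Hypothetical rating grades: (grade, obligors, observed defaults, predicted PD)
portfolio = [("A", 2000, 4, 0.002), ("B", 1500, 16, 0.006), ("C", 800, 25, 0.015)]
for grade, n, d, pd_hat in portfolio:
    print(grade, traffic_light(n, d, pd_hat))
```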
Cyclical adjustment of point-in-time PD [printed text] / S. Ingolfsson ; B. T. Elvarsson . - 2011 . - pp. 374–380
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Banking; Risk; Capital budgeting; Time series; Forecasting
Dewey class: 001.424. ISSN: 0160-5682
Abstract: Banking regulation stipulates that, to calculate minimum capital requirements, a long-term average of annual default probability (PD) should be used. Typically, logistic regression is applied with a 12-month sample period to obtain retail PD estimates. Thus the output will reflect the default rate in the sample, and not the long-term average. The ensuing calibration problem is addressed in the paper by a ‘variable scalar methodology’, based on an actual application in a commercial bank. Using quarterly intra-bank loss data over 15 years, a state-space model of the credit cycle is estimated by a Kalman filter, resulting in a structural decomposition of the credit cycle. This yields an adjustment factor for each point in the cycle for each of two client segments. The regulatory compliance aspects of such a framework, as well as some practical issues, are presented and discussed.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009136a.html
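A minimal sketch of the underlying idea, assuming a simple local-level state-space model rather than the paper's estimated specification: a Kalman filter tracks the point-in-time default level, and the ratio of the long-run average to that level gives a cyclical adjustment factor. The variances and the synthetic series are assumptions.

```python
# Hedged sketch: local-level Kalman filter on a quarterly default-rate series.
import numpy as np

def local_level_filter(y, q=1e-6, r=1e-5):
    """Scalar Kalman filter for y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t."""
    mu, P = y[0], 1.0
    filtered = []
    for obs in y:
        P = P + q                      # predict
        K = P / (P + r)                # update
        mu = mu + K * (obs - mu)
        P = (1 - K) * P
        filtered.append(mu)
    return np.array(filtered)

rng = np.random.default_rng(0)
cycle = 0.02 + 0.01 * np.sin(np.linspace(0, 6 * np.pi, 60))   # synthetic credit cycle
observed_dr = cycle + rng.normal(0, 0.002, 60)                # quarterly default rates
pit_level = local_level_filter(observed_dr)
long_run_pd = observed_dr.mean()
scalar = long_run_pd / pit_level       # adjustment factor for each point in the cycle
print(scalar[-4:])                     # multiply PIT PD estimates by this to approximate a long-run PD
```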
Estimation error in regulatory capital requirements: theoretical implications for consumer bank profitability [printed text] / P. Beling ; G. Overstreet ; K. Rajaratnam . - 2011 . - pp. 381–392
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Basel II; Economic capital; Consumer credit
Dewey class: 001.424. ISSN: 0160-5682
Abstract: Despite the topic's societal importance and despite progress in bank research, a lack of consensus exists concerning either the desirability of bank regulation or its optimal design. Enforcement of minimum bank capital standards has been shown to enhance bank stability, but also serves as a potential source of incremental costs, some of which are subtle. Such widely ambiguous research results point to the need for theoretical research regarding capital regulation across diverse banking systems. Along the latter lines, consumer bank issues have been generally neglected. This paper theoretically examines the performance implications of misestimating the regulatory capital requirement for a stylised consumer bank. For our stylised consumer bank, we prove that misestimation, irrespective of its direction, results in lower economic profits and, hence, value. Conclusions and implications for future work are drawn.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009109a.html
Modelling LGD for unsecured personal loans: decision tree approach [printed text] / A. Matuszyk ; C. Mues ; L. C. Thomas . - 2011 . - pp. 393–398
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Basel II; Consumer credit; LGD
Dewey class: 001.424. ISSN: 0160-5682
Abstract: The New Basel Accord, which was implemented in 2007, has made a significant difference to the use of modelling within financial organisations. In particular it has highlighted the importance of Loss Given Default (LGD) modelling. We propose a decision tree approach to modelling LGD for unsecured consumer loans where the uncertainty in some of the nodes is modelled using a mixture model whose parameters are obtained using regression. A case study based on default data from the in-house collections department of a UK financial organisation is used to show how such regression can be undertaken.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200967a.html
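The following sketch illustrates the general two-part idea under stated assumptions (a single "cure vs loss" split with a regression inside each node, synthetic data); it is not the paper's tree or its fitted mixture.

```python
# Hedged sketch: a one-split "tree" mixing a cure probability with a regression LGD.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(1)
n = 5000
loan_amount = rng.uniform(1, 20, n)            # assumed feature (thousands)
months_in_arrears = rng.integers(1, 24, n)     # assumed feature

# Synthetic "truth": higher arrears -> less likely to cure, higher LGD
cure = rng.random(n) < 1 / (1 + np.exp(0.15 * (months_in_arrears - 9)))
lgd = np.where(cure, 0.0,
               np.clip(0.3 + 0.02 * months_in_arrears + rng.normal(0, 0.1, n), 0, 1))

X = np.column_stack([loan_amount, months_in_arrears])
cure_model = LogisticRegression().fit(X, cure)                     # node probability
loss_node = ~cure
lgd_model = LinearRegression().fit(X[loss_node], lgd[loss_node])   # LGD given a loss

p_cure = cure_model.predict_proba(X)[:, 1]
expected_lgd = (1 - p_cure) * np.clip(lgd_model.predict(X), 0, 1)  # mixture of the two nodes
print(round(expected_lgd.mean(), 3), round(lgd.mean(), 3))
```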
Monte Carlo scenario generation for retail loan portfolios [printed text] / J. L. Breeden ; D. Ingram . - 2011 . - pp. 399–410
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Banking; Econometrics; Forecasting; Risk; Simulation; Time series
Dewey class: 001.424. ISSN: 0160-5682
Abstract: Monte Carlo simulation is a common method for studying the volatility of market traded instruments. It is less employed in retail lending, because of the inherent nonlinearities in consumer behaviour. In this paper, we use the approach of Dual-time Dynamics to separate loan performance dynamics into three components: a maturation function of months-on-books, an exogenous function of calendar date, and a quality function of vintage origination date. The exogenous function captures the impacts from the macroeconomic environment. Therefore, we want to generate scenarios for the possible futures of these environmental impacts. To generate such scenarios, we must go beyond the random walk methods most commonly applied in the analysis of market-traded instruments. Retail portfolios exhibit autocorrelation structure and variance growth with time that requires more complex modelling. This paper is aimed at practical application and describes work using ARMA and ARIMA models for scenario generation, rules for selecting the correct model form given the input data, and validation methods on the scenario generation. We find that, when the goal is capturing future volatility via Monte Carlo scenario generation, model selection does not follow the same rules as for forecasting. Consequently, tests more appropriate to reproducing volatility are proposed, which assure that distributions of scenarios have the proper statistical characteristics. These results are supported by studies of the variance growth properties of macroeconomic variables and theoretical calculations of the variance growth properties of various models. We also provide studies on historical data showing the impact of training length on model accuracy and the existence of differences between macroeconomic epochs.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009105a.html
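A hedged sketch of the scenario-generation step, assuming a plain AR(1) process for the exogenous component instead of the paper's full ARMA/ARIMA selection procedure; the synthetic history and parameters are illustrative only.

```python
# Hedged sketch: fit an AR(1) by least squares and generate Monte Carlo scenarios.
import numpy as np

rng = np.random.default_rng(2)
history = np.cumsum(rng.normal(0, 0.05, 120)) * 0.3 + rng.normal(0, 0.1, 120)  # synthetic exogenous series

# Fit x_t = c + phi * x_{t-1} + e_t by ordinary least squares
x_lag, x_now = history[:-1], history[1:]
phi, c = np.polyfit(x_lag, x_now, 1)
resid_std = np.std(x_now - (c + phi * x_lag))

def generate_scenarios(last_value, horizon=24, n_scenarios=1000):
    paths = np.empty((n_scenarios, horizon))
    x = np.full(n_scenarios, last_value)
    for t in range(horizon):
        x = c + phi * x + rng.normal(0, resid_std, n_scenarios)
        paths[:, t] = x
    return paths

scenarios = generate_scenarios(history[-1])
# Variance growth across the horizon is the property the paper validates against
print(scenarios.std(axis=0)[[0, 5, 11, 23]])
```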
Modelling credit risk of portfolio of consumer loans [printed text] / M. Malik ; L. C. Thomas . - 2011 . - pp. 411–420
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Finance; Credit risk; Survival analysis; Credit scoring
Dewey class: 001.424. ISSN: 0160-5682
Abstract: One of the issues that the Basel Accord highlighted was that, though techniques for estimating the probability of default and hence the credit risk of loans to individual consumers are well established, there were no models for the credit risk of portfolios of such loans. Motivated by the reduced form models for credit risk in corporate lending, we seek to exploit the obvious parallels between behavioural scores and the ratings ascribed to corporate bonds to build consumer-lending equivalents. We incorporate both consumer-specific ratings and macroeconomic factors in the framework of Cox Proportional Hazard models. Our results show that default intensities of consumers are significantly influenced by macro factors. Such models can then be used as the basis for simulation approaches to estimate the credit risk of portfolios of consumer loans.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009123a.html
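The sketch below is an illustrative simulation, not the paper's fitted Cox model: a proportional-hazards style default intensity depending on a behavioural-score band and an assumed macroeconomic path drives a simulation of the portfolio's default-count distribution. All coefficients, the baseline hazard and the macro path are assumptions.

```python
# Hedged sketch: hazard = baseline * exp(beta_score * band + beta_macro * macro_t).
import numpy as np

rng = np.random.default_rng(3)
n_loans, horizon = 5000, 12                    # loans, months
score_band = rng.integers(0, 3, n_loans)       # 0 = worst, 2 = best behavioural band
beta_score, beta_macro = -0.9, 0.6
baseline_hazard = 0.004                        # monthly

macro = 0.5 * np.sin(np.linspace(0, np.pi, horizon))   # assumed macro stress path

def simulate_portfolio_defaults(n_sims=500):
    counts = np.empty(n_sims)
    for s in range(n_sims):
        alive, defaults = np.ones(n_loans, dtype=bool), 0
        for t in range(horizon):
            hazard = baseline_hazard * np.exp(beta_score * score_band + beta_macro * macro[t])
            default_now = alive & (rng.random(n_loans) < hazard)
            defaults += default_now.sum()
            alive &= ~default_now
        counts[s] = defaults
    return counts

dist = simulate_portfolio_defaults()
print(np.percentile(dist, [50, 95, 99]))   # crude tail of the portfolio default count
```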
Scoring decisions in the context of economic uncertainty [printed text] / K. Rajaratnam ; P. Beling ; G. Overstreet . - 2011 . - pp. 421–429
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Portfolio optimisation; Decision-making under risk; Risk measures; Economic forecasts
Dewey class: 001.424. ISSN: 0160-5682
Abstract: We consider methods for incorporating forecasts of future economic conditions into acquisition decisions for scored retail credit and loan portfolios. We suppose that a portfolio manager is faced with two possible future economic scenarios, each characterised by a known probability of occurrence and by known performance functions that give expected profit and volume. We suppose further that he must choose in advance the scoring strategy and score cutoffs to optimise performance. We show that, despite the uncertainty of performance induced by economic conditions, every efficient policy consists of a single cutoff, provided the expected profit and volume performance curves in each scenario are concave. If these curves are not concave, efficient operating points can be characterised as cutoffs on a redefined score. In cases in which two scorecards are available, we show that it may be advantageous to randomly choose the scorecard to be employed, and we provide methods for selecting efficient operating points. Discussion is limited to cases with two scorecards and two economic scenarios, but our approach and results generalise to more scorecards and more economic scenarios.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200999a.html
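To make the decision problem concrete, the sketch below scans score cutoffs against two assumed economic scenarios weighted by their probabilities and reports the best single cutoff; the score distribution, profit and loss figures and the PD link are invented for illustration and are not the paper's performance functions.

```python
# Hedged sketch: expected profit over cutoffs under two assumed economic scenarios.
import numpy as np

rng = np.random.default_rng(4)
scores = rng.normal(600, 80, 50_000)                       # applicant score distribution
cutoffs = np.arange(450, 750, 5)

def scenario_profit(cutoff, default_rate_at_600, profit_good, loss_bad):
    accepted = scores >= cutoff
    # assumed link: default probability falls exponentially with score
    pd_accepted = default_rate_at_600 * np.exp(-(scores[accepted] - 600) / 150)
    profit = ((1 - pd_accepted) * profit_good - pd_accepted * loss_bad).sum()
    return profit, accepted.sum()

p_benign = 0.7                                             # scenario probabilities
results = []
for c in cutoffs:
    profit_b, vol = scenario_profit(c, 0.04, 100, 900)     # benign economy
    profit_s, _ = scenario_profit(c, 0.09, 100, 900)       # stressed economy
    results.append((p_benign * profit_b + (1 - p_benign) * profit_s, vol, c))

best_profit, best_volume, best_cutoff = max(results)
print(best_cutoff, round(best_profit), best_volume)
```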
Modelling take-up and profitability [printed text] / P. Ma ; J. Crook ; J. Ansell . - 2011 . - pp. 430–442
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Profitability; Acceptance; Credit; Take-up
Dewey class: 001.424. ISSN: 0160-5682
Abstract: We use response data collected by a lender to estimate the probabilities of loan offers being accepted by the applicants and the survival probabilities of default and of paying back early. Combining all of these, we estimate the expected profit surface for the lender at the time of application, before making an offer to an applicant. The results show how a lender could find the optimal interest rate to increase its expected profit or its market share. We also consider how different optimal decision policies could be applied to different market segments.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200933a.html
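A stylised sketch of the profit-surface idea under stated assumptions (a logistic take-up curve, fixed default and early-repayment probabilities, a simple interest calculation); the functional forms and parameters are not the lender's estimated models.

```python
# Hedged sketch: expected profit per applicant as a function of the offered rate.
import numpy as np

rates = np.linspace(0.05, 0.30, 251)           # annual interest rates offered
p_take = 1 / (1 + np.exp(25 * (rates - 0.16))) # assumed acceptance (take-up) curve
p_default, p_early = 0.05, 0.20                # assumed risk/attrition probabilities
loan, term_years, loss_given_default = 10_000, 3, 0.6

interest_income = loan * rates * term_years * (1 - p_early * 0.5)  # early payers yield ~half the interest
expected_profit = p_take * ((1 - p_default) * interest_income
                            - p_default * loan * loss_given_default)

best = np.argmax(expected_profit)
print(f"optimal rate ~ {rates[best]:.3f}, expected profit per applicant ~ {expected_profit[best]:.0f}")
```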
Question selection responding to information on customers from heterogeneous populations to select offers that maximize expected profit [printed text] / H-V. Seow . - 2011 . - pp. 443–454
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Credit scoring; ‘Take’ rates; Acceptance scoring; Question selection; Dynamic programming; Bayesian model
Dewey class: 001.424. ISSN: 0160-5682
Abstract: The advent of Internet broking pages allows customers to ‘apply’ to a number of different companies at one time, leading to multiple offers made to a customer. The saturated state of the personal financial products market has led to falling ‘take’ rates, and financial institutions are trying to increase the ‘take’ rates of their personal financial products. Applicants for credit have to provide information for risk assessment, which can also be used to assess the probability of a customer accepting an offer. Interactive channels such as the Internet and the telephone allow the questions that are asked to depend on previous answers. The questions selected need to provide information to assess the probability of acceptance of a particular variant of financial product. In this paper, we investigate a model to predict the best offer to extend next to a customer based on the responses to the questions, as well as the question selection itself.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors20096a.html
A new index of creditworthiness for retail credit products [printed text] / L. Quirini ; L. Vannucci . - 2011 . - pp. 455–461
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Credit scoring; Retail credit; Profitability; Creditworthiness; Fixed term credit product; Revolving credit product
Dewey class: 001.424. ISSN: 0160-5682
Abstract: This paper introduces a novel family of indexes to describe borrowers’ creditworthiness in retail credit products, both for fixed term loans and for open-ended products such as credit cards. Each index is the ratio at a given time of the net present value of actually received cashflows to the contractual ones. Some interpretations of the indexes are given, and it is also described how to link them to the profitability of the credit financial operation. For open-ended products, a competing risks survival analysis methodology is proposed to estimate the cashflow returns and illustrated with a simulation.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200968a.html
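The index itself is easy to state in code: the ratio of the net present value of the cashflows actually received to the NPV of the contractual cashflows at the same valuation date. The instalment plan, missed payments and discount rate below are assumptions used only to show the calculation.

```python
# Hedged sketch: creditworthiness index = NPV(actual cashflows) / NPV(contractual cashflows).
def npv(cashflows, monthly_rate):
    """cashflows: list of (month, amount)."""
    return sum(amount / (1 + monthly_rate) ** month for month, amount in cashflows)

def creditworthiness_index(actual, contractual, monthly_rate=0.01):
    return npv(actual, monthly_rate) / npv(contractual, monthly_rate)

# Fixed-term loan: 12 contractual instalments of 100; the borrower misses months 7 and 8
contractual = [(m, 100.0) for m in range(1, 13)]
actual = [(m, 100.0) for m in range(1, 7)] + [(m, 100.0) for m in range(9, 13)]

print(round(creditworthiness_index(actual, contractual), 3))   # < 1 signals impaired repayment
```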
Likelihood-ratio changepoint features for consumer-behaviour models [printed text] / A. R. Brentnall ; M. J. Crowder ; D. J. Hand . - 2011 . - pp. 462–472
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Automated teller machine; Consumer; Finance; Prediction; Risk; Statistics
Dewey class: 001.424. ISSN: 0160-5682
Abstract: Some predictive models for customer value management might benefit from information about certain changes in individual-consumer behaviour. We take changepoint methods as the first step in producing a model-input feature for this purpose. An unusual feature in the application of changepoint methods to consumer data is that there are as many streams of data as there are customers. This property is used to help decide whether an individual has changed their behaviour by ordering likelihood-ratio statistics from the changepoint models. Following a review of changepoint methods, the approach is demonstrated on cash machine transactions. Models for the amount, location and time of transaction are used, and accounts exhibiting strong evidence of change are examined in detail. For the data set used the approach performs sensibly. The worth of likelihood-ratio statistics to rank evidence for change is considered more generally through some of the literature.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009160a.html
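A hedged sketch of the ranking idea, assuming a standard Gaussian mean-shift likelihood-ratio statistic per customer stream rather than the paper's transaction models; the data are synthetic.

```python
# Hedged sketch: rank customers by a single-changepoint likelihood-ratio statistic.
import numpy as np

def lr_changepoint_statistic(x, min_seg=5):
    """Max over k of loglik(two means) - loglik(one mean), with a common variance."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sigma2 = x.var() + 1e-9
    full_rss = ((x - x.mean()) ** 2).sum()
    best = 0.0
    for k in range(min_seg, n - min_seg):
        rss = ((x[:k] - x[:k].mean()) ** 2).sum() + ((x[k:] - x[k:].mean()) ** 2).sum()
        best = max(best, (full_rss - rss) / (2 * sigma2))
    return best

rng = np.random.default_rng(5)
customers = {
    "no-change": rng.normal(60, 15, 80),
    "changed":   np.concatenate([rng.normal(60, 15, 50), rng.normal(120, 15, 30)]),
}
ranked = sorted(customers, key=lambda c: lr_changepoint_statistic(customers[c]), reverse=True)
print(ranked)   # customers with the strongest evidence of change come first
```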
Reject inference in survival analysis by augmentation [printed text] / J. Banasik ; J. Crook . - 2011 . - pp. 473–485
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Credit scoring; Reject inference; Augmentation; Survival analysis; Proportional hazard
Dewey class: 001.424. ISSN: 0160-5682
Abstract: The literature suggests that the commonly used augmentation method of reject inference achieves no appreciable benefit in the context of logistic and probit regression models. Ranking is not improved and the ability to discern a correct cut-off is undermined. This paper considers the application of augmentation to profit scoring applicants by means of survival analysis and the Cox proportional hazard model in particular. This new context involves more elaborate models answering more specific questions, such as when default will occur and what its precise financial implication will be. Also considered in this paper is the extent to which the rejection rate is critical to the potential usefulness of reject inference and how far augmentation meets that potential. The conclusion is essentially that augmentation achieves only negative benefits, and that the scope for reject inference in this context pertains mainly to circumstances where a high proportion of applicants have been rejected.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2008180a.html
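For readers unfamiliar with augmentation, the sketch below shows the usual re-weighting step (accepted cases weighted by the inverse acceptance rate of their score band). The bands, acceptance rule and data are synthetic, and this is not the paper's survival-analysis experiment.

```python
# Hedged sketch: augmentation weights = 1 / acceptance rate within each score band.
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
score = rng.normal(0, 1, n)
accepted = rng.random(n) < 1 / (1 + np.exp(-2 * score))   # higher score -> more likely accepted

bands = np.digitize(score, np.quantile(score, np.linspace(0.1, 0.9, 9)))
weights = np.zeros(n)
for b in np.unique(bands):
    in_band = bands == b
    acc_rate = accepted[in_band].mean()
    if acc_rate > 0:
        # each accepted case also "stands in" for the rejected cases of its band
        weights[in_band & accepted] = 1.0 / acc_rate

# 'weights' would then be passed as case weights when refitting the scoring model
# on accepted cases only (e.g. a weighted logistic or proportional hazards fit).
print(round(weights[accepted].sum()), n)   # weighted accepts ~ size of the through-the-door population
```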
Effects of missing data in credit risk scoring: a comparative analysis of methods to achieve robustness in the absence of sufficient data [printed text] / R. Florez-Lopez . - 2011 . - pp. 486–501
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Banking; Credit risk; Management; Forecasting; Missing data; Scarce data
Dewey class: 001.424. ISSN: 0160-5682
Abstract: The 2004 Basel II Accord has pointed out the benefits of credit risk management through internal models using internal data to estimate risk components: probability of default (PD), loss given default, exposure at default and maturity. Internal data are the primary data source for PD estimates; banks are permitted to use statistical default prediction models to estimate the borrowers’ PD, subject to some requirements concerning accuracy, completeness and appropriateness of data. However, in practice, internal records are usually incomplete or do not contain adequate history to estimate the PD. Missing data are particularly critical with regard to low default portfolios, characterised by inadequate default records, making it difficult to design statistically significant prediction models. Several methods might be used to deal with missing data, such as list-wise deletion, application-specific list-wise deletion, substitution techniques or imputation models (simple and multiple variants). List-wise deletion is an easy-to-use method widely applied by social scientists, but it loses substantial data and reduces the diversity of information, resulting in a bias in the model's parameters, results and inferences. The choice of the best method to solve the missing data problem largely depends on the nature of the missing values (MCAR, MAR and MNAR processes), but there is a lack of empirical analysis about their effect on credit risk that limits the validity of resulting models. In this paper, we analyse the nature and effects of missing data in credit risk modelling (MCAR, MAR and MNAR processes) and take into account a scarce data set on consumer borrowers, which includes different percentages and distributions of missing data. The findings are used to analyse the performance of several methods for dealing with missing data, such as list-wise deletion, simple imputation methods, MLE models and advanced multiple imputation (MI) alternatives based on Markov Chain Monte Carlo and re-sampling methods. Results are evaluated and discussed between models in terms of robustness, accuracy and complexity. In particular, MI models are found to provide very valuable solutions with regard to credit risk missing data.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200966a.html
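A deliberately small illustration of why the choice of method matters, far simpler than the paper's comparison: with predictor values missing completely at random, list-wise deletion keeps the income-default correlation roughly intact while discarding half the sample, whereas naive mean imputation visibly attenuates it. All data and rates are synthetic assumptions.

```python
# Hedged sketch: list-wise deletion vs mean imputation under MCAR missingness.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
income = rng.normal(3.0, 1.0, n)
default = (rng.random(n) < 1 / (1 + np.exp(2.5 + 1.2 * income))).astype(float)

missing = rng.random(n) < 0.5              # 50% of income values missing (MCAR)
observed = ~missing
corr = lambda a, b: np.corrcoef(a, b)[0, 1]

print("full data         :", round(corr(income, default), 3))
print("list-wise deletion:", round(corr(income[observed], default[observed]), 3),
      "on", observed.sum(), "cases")
income_imputed = np.where(missing, income[observed].mean(), income)
print("mean imputation   :", round(corr(income_imputed, default), 3))
```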
A novel maximum dispersion territory design model arising in the implementation of the WEEE-directive [printed text] / E. Fernández ; J. Kalcsics ; S. Nickel . - 2011 . - pp. 503–514
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Heuristics; Optimization; Logistics; Recycling; Territory design
Dewey class: 001.424. ISSN: 0160-5682
Abstract: The problem discussed in this paper is motivated by the new recycling directive on Waste Electrical and Electronic Equipment (WEEE) of the European Commission. The core of this law is that each company which sells electrical or electronic equipment in a European country has the obligation to recollect and recycle an amount of returned items proportional to its market share. To assign collection stations to companies, a territory design approach is planned in Germany for one product type. However, in contrast to classical territory design, the territories should be geographically as dispersed as possible, to avoid a company, or the logistics provider responsible for its recollection, gaining a monopoly in some region. First, we identify an appropriate measure for the dispersion of a territory. Afterwards, we present a first mathematical programming model for this new problem as well as a solution method based on the Greedy Randomized Adaptive Search Procedure (GRASP) methodology. Extensive computational results illustrate the suitability of the model and assess the effectiveness of the heuristic.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200970a.html
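A compact GRASP-style construction is sketched below under stated assumptions (dispersion measured as total within-territory pairwise distance, quotas proportional to market share, a random instance); it is only meant to convey the flavour of the randomized greedy step, not the paper's dispersion measure, model or heuristic.

```python
# Hedged sketch: randomized greedy assignment of stations to companies, repeated, best kept.
import numpy as np

rng = np.random.default_rng(8)
stations = rng.random((40, 2)) * 100                     # station coordinates
quotas = {"A": 20, "B": 12, "C": 8}                      # proportional to market share
dist = np.linalg.norm(stations[:, None, :] - stations[None, :, :], axis=2)

def dispersion(assign):
    total = 0.0
    for comp in quotas:
        idx = [i for i, c in assign.items() if c == comp]
        total += dist[np.ix_(idx, idx)].sum() / 2        # within-territory pairwise distance
    return total

def grasp_construct(alpha=0.3):
    assign, remaining = {}, dict(quotas)
    for i in rng.permutation(len(stations)):
        gains = []
        for comp, cap in remaining.items():
            if cap == 0:
                continue
            members = [j for j, c in assign.items() if c == comp]
            gains.append((sum(dist[i, j] for j in members), comp))
        gains.sort(reverse=True)
        rcl = gains[:max(1, int(np.ceil(alpha * len(gains))))]   # restricted candidate list
        _, comp = rcl[rng.integers(len(rcl))]
        assign[i] = comp
        remaining[comp] -= 1
    return assign

best = max((grasp_construct() for _ in range(50)), key=dispersion)
print(round(dispersion(best), 1))
```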
Vehicle routing and scheduling with time-varying data: a case study [printed text] / W. Maden ; R. Eglese ; D. Black . - 2011 . - pp. 515–522
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Vehicle routing; Distribution; Heuristics; Environment
Dewey class: 001.424. ISSN: 0160-5682
Abstract: A heuristic algorithm is described for vehicle routing and scheduling problems to minimise the total travel time, where the time required for a vehicle to travel along any road in the network varies according to the time of travel. The variation is caused by congestion that is typically greatest during morning and evening rush hours. The algorithm is used to schedule a fleet of delivery vehicles operating in the South West of the United Kingdom for a sample of days. The results demonstrate how conventional methods that do not take time-varying speeds into account when planning, except for an overall contingency allowance, may still lead to some routes taking too long. The results are analysed to show that, in the case study, using the proposed approach can lead to savings in CO2 emissions of about 7%.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009116a.html
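The core ingredient, time-dependent travel times, can be illustrated with a small sketch: a leg's duration depends on when it is started because speed varies with the time of day. The speed profile, distances and service times are assumptions, and this is not the paper's routing heuristic.

```python
# Hedged sketch: route duration under a time-of-day speed profile.
def speed_kmh(hour):
    """Assumed speed profile with morning and evening rush hours."""
    if 7 <= hour < 9 or 16 <= hour < 18:
        return 25.0
    if 9 <= hour < 16:
        return 45.0
    return 60.0

def leg_duration_hours(distance_km, depart_hour, step=0.1):
    """Advance in small time steps so a leg can span several speed periods."""
    travelled, t = 0.0, depart_hour
    while travelled < distance_km:
        travelled += speed_kmh(t % 24) * step
        t += step
    return t - depart_hour

def route_duration(legs_km, start_hour, service_time_h=0.25):
    t = start_hour
    for d in legs_km:
        t += leg_duration_hours(d, t) + service_time_h
    return t - start_hour

legs = [18, 12, 25, 9]   # delivery legs in km
for start in (6.0, 8.0, 10.0):
    print(f"start {start:04.1f}h -> route takes {route_duration(legs, start):.2f}h")
```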
Reducing fuel emissions by optimizing speed on shipping routes [printed text] / K. Fagerholt ; G. Laporte ; I. Norstad . - 2011 . - pp. 523–529
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Shipping routes; Fuel emissions; Speed optimization; Shortest paths
Dewey class: 001.424. ISSN: 0160-5682
Abstract: Fuel consumption and emissions on a shipping route are typically a cubic function of speed. Given a shipping route consisting of a sequence of ports with a time window for the start of service, substantial savings can be achieved by optimizing the speed of each leg. This problem is cast as a non-linear continuous program, which can be solved by a non-linear programming solver. We propose an alternative solution methodology, in which the arrival times are discretized and the problem is solved as a shortest path problem on a directed acyclic graph. Extensive computational results confirm the superiority of the shortest path approach and the potential for fuel savings on shipping routes.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors200977a.html
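The discretize-and-shortest-path idea can be sketched as a small dynamic program over discretized arrival times, with fuel per unit time assumed proportional to speed cubed; the route, time windows, speed limits and constants are invented for illustration.

```python
# Hedged sketch: shortest-path / DP over discretized arrival times with cubic fuel.
import math

legs_nm = [600, 450, 800]                            # distances between consecutive ports
windows = [(0, 0), (40, 80), (90, 140), (160, 220)]  # (earliest, latest) service start, hours
V_MIN, V_MAX, K = 10.0, 20.0, 1e-4                   # knots; K scales fuel per hour at 1 knot
STEP = 2                                             # time discretization in hours

def fuel(distance, hours):
    v = distance / hours
    if v < V_MIN or v > V_MAX:
        return math.inf
    return K * v ** 3 * hours                        # = K * distance * v**2

# best[t] = minimum fuel to arrive at the current port at time t
best = {0: 0.0}
for leg, (lo, hi) in zip(legs_nm, windows[1:]):
    new_best = {}
    for t in range(lo, hi + 1, STEP):
        costs = [c + fuel(leg, t - t_prev) for t_prev, c in best.items() if t > t_prev]
        feasible = [c for c in costs if c < math.inf]
        if feasible:
            new_best[t] = min(feasible)
    best = new_best

arrival, total_fuel = min(best.items(), key=lambda kv: kv[1])
print(f"arrive final port at hour {arrival}, minimum fuel ~ {total_fuel:.1f}")
```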
Minimizing greenhouse gas emissions in intermodal freight transport: an application to rail service design [printed text] / J. Bauer ; T. Bektas ; Crainic . - 2011 . - pp. 530–542
in Journal of the operational research society (JORS) > Vol. 61 N° 3 (March 2010)
General note: Operational research. Language: English (eng)
Keywords: Green logistics; Greenhouse gas emissions; Intermodal freight transport; Scheduled service network design; Space time network; Multicommodity network design
Dewey class: 001.424. ISSN: 0160-5682
Abstract: Freight transport has undesirable effects on the environment, the most prominent of which is greenhouse gas emissions. Intermodal freight transport, where freight is shipped from origin to destination by a sequence of at least two transportation modes, offers the possibility of shifting freight (either partially or in full) from one mode to another in the hope of reducing greenhouse emissions by appropriately scheduling the services and routing the freight. Traditional planning methods for scheduling services in an intermodal transportation network usually focus on minimizing travel or time-related costs of transport. This article breaks away from such an approach by addressing the issue of incorporating environment-related costs (greenhouse gases, to be specific) into freight transportation planning, and proposes an integer program in the form of a linear cost, multicommodity, capacitated network design formulation that minimizes the amount of greenhouse gas emissions of transportation activities. Computational results based on an application of the proposed approach to a real-life rail freight transportation network are presented.
Online: http://www.palgrave-journals.com/jors/journal/v61/n3/abs/jors2009102a.html
Copies
Barcode | Call number | Medium | Location | Section | Availability
---|---|---|---|---|---
no copies