SEACORN :: Network Simulator Comparison

Posted 2009-04-03 23:33
There are numerous network simulation tools on the market today, both
commercial and non-commercial. Commonly used non-commercial simulators
include ns2 with its contributed extensions and models, OMNeT++,
TANGRAM-II, Parsec, SMURPH, Ptolemy, NetSim++, C++SIM, CLASS, ANGLES,
GloMoSim for ad-hoc and wireless networks, and DaSSF/SSFNet for large
networks. OPNET, QualNet and COMNET III are typical commercial tools.
In the commercial category, OPNET is widely held to be the state of the
art in network simulation; ns2 is considered the most popular of the
non-commercial tools. OMNeT++ targets roughly the same segment of
network simulation as OPNET. The objective is to reuse any suitable
software, or selection of software, that is well established and
tested. The software tool(s) to be used in the project will be selected
at the beginning of the project by the consortium members.
Validation and assessment are a hard nut to crack even for existing,
widely used (integrated) commercial and non-commercial simulation
tools: the complexity of the protocols and the level of abstraction
required make them very difficult tasks. The experience of the
partners with, for example, OPNET and ns2 (also evidenced in
discussion lists) shows that even well-tested simulation software can
still have bugs in the seemingly well-understood TCP/IP protocol suite
(IPv4), caused either by the code or by the protocol interpretation.
(A recent study identified more than 400 different implementations and
versions of the TCP/IP stack.) The trade-off between simulation
tractability and oversimplification of results is not easy to resolve.
Striking the right balance between the fundamental dynamics of packet
behaviour in the Enhanced UMTS environment and the specific
performance of particular environments and protocols in a future
scenario is by no means an easy task.
A recent paper by Sally Floyd and Vern Paxson, 'Difficulties in
Simulating the Internet', IEEE/ACM Transactions on Networking, Vol. 9,
No. 4, August 2001, highlights the above difficulties. Its abstract
states clearly that simulating how the global Internet (a live,
running network) behaves is an extremely challenging undertaking
because of the network's great heterogeneity and rapid change. One of
the biggest problems in simulating the Internet is the lack of typical
'operational' data or traces (Internet traffic has proven to evolve
rapidly and dynamically) against which simulator behaviour could be
validated. This is expected to worsen, as the current networking
paradigm will differ from a future networking environment based on
high-speed wired and wireless access (especially for Enhanced UMTS),
supported by a new set of protocols and a rich set of new
applications. Measurement and experimentation with current Internet
traffic have limitations; they can possibly be used to explore
particular new environments, but not new architectures and
applications for the future Internet. Adding mobility and high-speed
wireless networking increases the complexity further, since radio
propagation and node mobility are difficult and expensive to model and
there is little operational experience with such networks.
Another recent paper, by J. Heidemann, K. Mills and S. Kumar,
'Expanding Confidence in Network Simulations', IEEE Network,
September/October 2001, discusses best practices for validating
simulations in general and TCP models across various simulation
environments in particular. Despite the almost universal use of
simulation to predict the performance of complex networks, to
understand the behaviour of existing protocols, and to assess the
correctness and performance of new protocol designs, there are no
widely accepted practices and techniques for validating network
simulations and evaluating the trustworthiness of their results.
Validation is the process of assuring that a model provides
meaningful, trustworthy answers to the questions being investigated,
in accordance with real-world behaviour. As models often involve
approximations of, or abstractions from, reality, validation provides
confidence that these approximations do not substantially alter the
answers to the questions being posed. Furthermore, different
situations require different levels of validation; the level required
for a network simulation is influenced by the questions being asked
and by the systems being studied. The paper discusses issues (and
difficulties) that need to be considered when validating network
simulations. These include:
  • Establishment of 'ground truth' and, whenever possible, direct comparison of simulator results with reality.
  • Clarity of specifications vs. implementations. Some protocol
        parameter settings (e.g. window size) may affect performance
        substantially (changes in steady-state throughput by a factor of 2-10
        have been reported).
  • Comparing simulations as protocol designs evolve. As the Internet
        is dynamic and protocol designs evolve, care must be exercised when
        comparing specific implementations and in the way general conclusions
        are drawn.
  • Comparing simulations as network traffic changes. Validation
        against yesterday's or today's traffic mix may not have much to do with
        future networking traffic patterns and scenarios.
  • Choosing appropriate metrics for comparisons. Given the 'ground
        truth', metrics must be derived to compare simulation results against
        it. Evaluating against the full real-world behaviour is not practical,
        so specific aspects are usually evaluated; generalising from the
        specifics is still an open research question.
  • Evaluating the sensitivity of simulations. Once validation has been
        performed under one set of conditions, sensitivity analysis helps
        understand how varying configurations change the accuracy of the
        simulation.
  • Large-scale simulation and validation. Two complementary approaches
        to large-scale simulation are widely adopted: parallel execution and
        abstraction. Hierarchical (recursive) composition, built from
        well-validated components within a well-validated framework, can also
        be usefully employed.
  • Incorporating mathematical models as subsystems in discrete event simulations can greatly help.
  • Assessing cost-benefit trade-offs. The extent, and therefore the
        cost, of validation must be weighed against the likely benefits.
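One point above is that incorporating mathematical models as subsystems of a discrete-event simulation can greatly help validation. As a minimal sketch (not SEACORN code; the function and parameter names are hypothetical), the Python snippet below validates a tiny discrete-event M/M/1 queue simulation against the exact analytic mean sojourn time W = 1/(mu - lambda):

```python
import random

def simulate_mm1(lam, mu, num_customers, seed):
    """Discrete-event simulation of an M/M/1 FIFO queue via the
    Lindley recursion; returns the mean customer sojourn time."""
    rng = random.Random(seed)           # seeded for reproducibility
    arrival = depart = total = 0.0
    for _ in range(num_customers):
        arrival += rng.expovariate(lam)             # Poisson arrivals
        start = max(arrival, depart)                # wait for the server
        depart = start + rng.expovariate(mu)        # exponential service
        total += depart - arrival                   # sojourn time
    return total / num_customers

lam, mu = 0.5, 1.0
sim = simulate_mm1(lam, mu, 200_000, seed=1)
analytic = 1.0 / (mu - lam)   # exact M/M/1 mean sojourn time W = 1/(mu - lambda)
print(f"simulated W = {sim:.3f}, analytic W = {analytic:.3f}")
```

A large discrepancy between the two numbers would flag a bug in either the event logic or the random-variate generation, which is exactly the kind of 'ground truth' check the paper advocates.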

For validation and assessment in SEACORN we plan to include a selection of complementary approaches, for example:
  • Clearly and explicitly document the assumptions, abstractions, and limitations.
  • Identify the dependencies between any assumptions and
        simplifications, thus allowing the resulting error to be controlled
        and taken into account.
  • Identify and explore the dangers and pitfalls in modelling and simulating Enhanced UMTS.
  • Whenever possible, complement and compare simulation results with
        models, analysis, measurements, and experiments keeping in mind the
        limitations of each method.
  • Review of existing literature on methodologies and best practices with regard to validation and assessment.
  • Define metrics and criteria for assessing and validating the simulation software.
  • Construct confidence intervals so that the reliability of the simulation results can be identified.
  • Design in as many means as possible for examining the state of simulation (including animations where practicable).
  • Identify any behaviour that has been shown empirically to hold
        in a very wide range of environments ('invariants' in behaviour) and
        back these with a sound theoretical analysis. Caution will be
        exercised as to the validity of an 'invariant' in a future Enhanced
        UMTS networking scenario. Develop heuristics for the evaluation of
        Enhanced UMTS network performance.
  • Identify and explore a representative parameter space (sensitivity analysis) of behaviour.
  • Design validation tests that will include standards tests and run
        these validation tests regularly, including for each simulator release.
  • Use reference scenarios, as for example those defined by 3GPP in TR25.942 and TR101.112.
  • Ensure that simulation results are reproducible. Care is required
        in generating pseudo-random number sequences and in mitigating
        floating-point rounding errors, which may affect event concurrency
        (especially when using parallel simulation).
  • Make simulation scripts and scenarios publicly available, so that
        other researchers can easily check for themselves (and validate) the
        effect of changing network conditions and assumptions.

This post comes from a ChinaUnix blog; the original is at: http://blog.chinaunix.net/u/22344/showart_1889998.html