What does ns/day mean in high-performance computing?

In MD simulation benchmarking, you often see performance expressed in terms of ns/day and hours/ns.

How do I translate these values into the amount of real time a simulation takes? In other words, what would be a reasonable ns/day value for running standard MD simulations on GPUs?
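To make the question concrete, here is the back-of-the-envelope conversion I have in mind; the 100 ns target and the 50 ns/day throughput below are made-up example numbers, not real benchmark figures:

```python
# Back-of-the-envelope conversion between ns/day, hours/ns, and wall-clock time.
# target_ns and ns_per_day are made-up example values, not real benchmark numbers.

target_ns = 100.0    # simulated time I want to reach (ns)
ns_per_day = 50.0    # benchmark throughput (ns of simulation per day of wall-clock time)

wall_clock_days = target_ns / ns_per_day    # real days needed for the run
wall_clock_hours = wall_clock_days * 24.0   # same duration in hours
hours_per_ns = 24.0 / ns_per_day            # equivalent hours/ns figure

print(f"{target_ns:g} ns at {ns_per_day:g} ns/day -> "
      f"{wall_clock_days:g} days ({wall_clock_hours:g} h), "
      f"i.e. {hours_per_ns:g} hours/ns")
```

If that arithmetic is correct, then the remaining part of my question is simply what ns/day figure is typical for a GPU run.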
