2012
DOI: 10.5047/eps.2011.05.035

Effect of ion cyclotron motion on the structure of wakes: A Vlasov simulation

Abstract: The global structure of wake fields behind an unmagnetized object in the solar wind is studied by means of a 2.5-dimensional full electromagnetic Vlasov simulation. The interaction of a plasma flow with an unmagnetized object is quite different from that with a magnetized object such as the Earth. Due to the absence of the global magnetic field, the unmagnetized object absorbs plasma particles which reach the surface, generating a plasma cavity called a wake on the anti-solar side of the object. The interactio…

Citations: cited by 17 publications (11 citation statements)
References: 19 publications

“…As a result, we obtained the best computational speed of 8.19 TFlops (performance efficiency of 10.9%) with 8192 cores on the HA8000, 2.95 TFlops (14.6%) with 2304 cores on the R815, 3.17 TFlops (15.5%) with 2048 cores on the HX600, 4.10 TFlops (13.3%) with 3072 cores on the FX1, and 255 GFlops (10.6%) with 128 cores on the SR16000/L2. Note that we have applied the present parallel 2P2V and 2P3V Vlasov-Maxwell solvers to the K-H instability [3], magnetic reconnection [6], and the interaction between solar/stellar winds and dielectric bodies [7], [8]. Currently, 256-1024 cores are used for the parallel computations.…”
Section: Discussion
confidence: 99%
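
As a quick plausibility check on the benchmark figures quoted above: performance efficiency is sustained speed divided by the machine's theoretical peak, so each quoted (speed, efficiency) pair implies a peak and a per-core rate. The Python sketch below derives those numbers; the input table is copied from the excerpt, while the implied peaks and per-core rates are computed here and do not appear in the source.

    # Hypothetical check: efficiency = achieved / peak, so peak = achieved / efficiency.
    benchmarks = [
        # (system, achieved TFlops, efficiency, cores), as quoted above
        ("HA8000",     8.19,  0.109, 8192),
        ("R815",       2.95,  0.146, 2304),
        ("HX600",      3.17,  0.155, 2048),
        ("FX1",        4.10,  0.133, 3072),
        ("SR16000/L2", 0.255, 0.106, 128),
    ]
    for system, achieved, eff, cores in benchmarks:
        peak = achieved / eff              # implied theoretical peak, TFlops
        per_core = achieved * 1e3 / cores  # sustained GFlops per core
        print(f"{system:>11}: implied peak {peak:5.1f} TFlops, "
              f"{per_core:5.2f} GFlops/core sustained")
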
“…Fig. 4(c) shows the magnetosphere of a small dielectric body obtained by the 2P3V Vlasov simulation [7], [8]. The solar-wind plasma is injected from the left boundary.…”
Section: Applications
confidence: 99%
“…The present parallel Vlasov code is one of a few examples of successful hyperdimensional Vlasov simulations that have been applied to practical problems of plasma physics, such as magnetic reconnection [10,11], the Kelvin-Helmholtz instability [12,13], and the global interaction between the solar wind and a small astronomical body with a spatial scale of the ion gyro radius [14,15,16]. Our parallel Vlasov code is designed with a high-accuracy and memory-saving scheme, especially for recent supercomputer systems with a small shared memory, and a memory size of 1 GB per core is enough for stable and high-performance computation.…”
Section: Overview of Basic Equations and Numerical Schemes
confidence: 99%
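
The 1 GB-per-core budget mentioned in this excerpt is easy to motivate: a hyperdimensional Vlasov code stores the full distribution function on a grid, so memory grows as the product of all phase-space dimensions. The sketch below estimates that footprint for a 2P3V (two spatial plus three velocity dimensions) subdomain; the grid sizes are illustrative assumptions, not values from the cited paper.

    # Rough memory estimate for a 2P3V distribution function f(x, y, vx, vy, vz).
    # Subdomain sizes are hypothetical, chosen only to illustrate the scaling.
    BYTES_PER_DOUBLE = 8

    def grid_memory_gib(nx, ny, nvx, nvy, nvz, species=2):
        """Memory in GiB to store f for all species in double precision."""
        cells = nx * ny * nvx * nvy * nvz
        return cells * species * BYTES_PER_DOUBLE / 2**30

    # A 40x40 spatial subdomain per core with a 30^3 velocity grid already
    # needs ~0.64 GiB for two species, close to a 1 GB/core budget.
    print(f"{grid_memory_gib(40, 40, 30, 30, 30):.2f} GiB")
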
“…The Vlasov simulation can use a grid spacing longer than the Debye length as long as the E × B drift motion of thermal plasma is resolved appropriately (Umeda et al. 2009, 2010). Thus, the Vlasov simulation can handle solar system bodies with a spatial scale of several tens of the gyro radius of thermal electrons (r_e) (Umeda et al. 2011; Umeda 2012a; Umeda and Ito 2014). One critical drawback of the Vlasov simulation is that huge computational resources are required.…”
Section: Introduction
confidence: 99%
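
The scale separation behind this excerpt's claim can be made concrete: in the solar wind the electron thermal gyro radius is orders of magnitude larger than the Debye length, which is why a grid spacing above the Debye length can still resolve electron-gyro-scale structure. The sketch below uses typical 1-AU plasma parameters assumed here for illustration; they are not taken from the cited papers.

    import math

    EPS0 = 8.854e-12  # vacuum permittivity, F/m
    KB   = 1.381e-23  # Boltzmann constant, J/K
    QE   = 1.602e-19  # elementary charge, C
    ME   = 9.109e-31  # electron mass, kg

    # Assumed solar-wind parameters near 1 AU (illustrative values only).
    n_e = 5e6   # electron density, m^-3
    T_e = 1e5   # electron temperature, K (~10 eV)
    B   = 5e-9  # magnetic field strength, T

    debye = math.sqrt(EPS0 * KB * T_e / (n_e * QE**2))  # Debye length, m
    v_th  = math.sqrt(KB * T_e / ME)                    # electron thermal speed, m/s
    r_e   = ME * v_th / (QE * B)                        # electron gyro radius, m

    print(f"Debye length    ~ {debye:7.1f} m")
    print(f"Gyro radius r_e ~ {r_e:7.1f} m ({r_e / debye:.0f} Debye lengths)")

With these assumed values, r_e comes out near 1.4 km, roughly 140 Debye lengths, consistent with handling bodies of several tens of r_e on a grid coarser than the Debye length.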