GAIA first batch of parallax measurements out
IngolifsDate: Wednesday, 14.09.2016, 23:34 | Message # 1
Space Tourist
Group: Users
New Zealand
Messages: 30
Status: Offline
http://sci.esa.int/gaia....to-come

"On its way to assembling the most detailed 3D map ever made of our Milky Way galaxy, Gaia has pinned down the precise position on the sky and the brightness of 1142 million stars."

GAIA is the successor to the Hipparcos mission - the mission that used parallax to find the distances to roughly 100,000 nearby stars in the Milky Way (and thereby provided the coordinates for most of the catalogue stars in SpaceEngine). With Gaia, this number looks set to increase to one billion, so in time I think we can expect to see far fewer procedural stars in the Milky Way galaxy in SE.
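The conversion behind these measurements is simple: distance in parsecs is the reciprocal of the parallax in arcseconds (Gaia reports parallaxes in milliarcseconds). A minimal sketch:

```python
def parallax_to_distance_pc(parallax_mas: float) -> float:
    """Convert a parallax in milliarcseconds to a distance in parsecs.

    Distance (pc) = 1 / parallax (arcsec); Gaia reports parallaxes in mas.
    """
    return 1.0 / (parallax_mas / 1000.0)

# Proxima Centauri's parallax is roughly 768 mas -> about 1.3 pc.
print(parallax_to_distance_pc(768.0))  # -> ~1.302
```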
 
AlekDate: Thursday, 15.09.2016, 00:07 | Message # 2
Pioneer
Group: Users
United States
Messages: 326
Status: Offline
Can you imagine the download time though?? For a lot of people it already takes hours to download SE -- having that many stars would make it take twice as long, if not more!




Living among the stars, I find my way. I grow in strength through knowledge of the space I occupy, until I become the ruler of my own interstellar empire of sorts. Though the world was made for the day, I was made for the night, and thus, the universe itself is within my destiny.
 
steeljaw354Date: Thursday, 15.09.2016, 00:14 | Message # 3
World Builder
Group: Users
Pirate
Messages: 862
Status: Offline
Why not have many files that are sort of like an addon?
 
WatsisnameDate: Thursday, 15.09.2016, 01:08 | Message # 4
Galaxy Architect
Group: Global Moderators
United States
Messages: 2613
Status: Offline
Quote Alek ()
Can you imagine the download time though?? For a lot of people it already takes hours downloading SE-- having that many stars would take it twice as long if not more to download!


There's an addon for Celestia which adds 1 or 2 million stars, the files being 17 and 35 MB, respectively. That's about 17 bytes per star. I'm not sure how much data it takes to show catalog stars in SE, but at that rate, to represent the Gaia catalog would take about 19GB. Pretty big. smile
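As a back-of-the-envelope check on that extrapolation (using the 1142 million star count from the DR1 article above):

```python
# Sanity check of the 17-bytes-per-star figure scaled to the Gaia catalog.
bytes_per_star = 17
gaia_stars = 1_142_000_000  # stars with positions in Gaia DR1
total_bytes = bytes_per_star * gaia_stars
print(total_bytes / 1e9)  # -> ~19.4 (GB)
```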





 
JackDoleDate: Thursday, 15.09.2016, 11:37 | Message # 5
Star Engineer
Group: Local Moderators
Germany
Messages: 1742
Status: Offline
The HIPPARCOS.csv catalog in SE needs about 70 bytes per star. That would be about 70 gigabytes for a billion stars; packed, I'd guess about 25 gigabytes.
And HIPPARCOS.csv is not complete: it has no data on the radius, mass, or temperature of the stars. With that information it would be considerably larger.

One would have to return to a binary form of the catalog, like the catalog in Celestia. That requires significantly fewer bytes.

It would naturally still be too large, since SE cannot handle such large files.

Perhaps we should divide the Milky Way into Quadrants.
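To illustrate how much a binary record saves over a ~70-byte CSV row, here is a hypothetical fixed-size layout built with Python's `struct` module (the field choice is illustrative only, not Celestia's or SE's actual format):

```python
import struct

# Hypothetical binary record for one star: XYZ position (3 floats),
# luminosity (1 float), packed spectral class (1 unsigned short).
# Little-endian, no padding -> 18 bytes, versus ~70 bytes as CSV text.
STAR_RECORD = struct.Struct("<fff f H")

record = STAR_RECORD.pack(1.34, -0.87, 2.10, 0.0017, 0x1234)
print(len(record))  # -> 18
```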





Don't forget to look here.

 
SpaceEngineerDate: Thursday, 15.09.2016, 13:33 | Message # 6
Author of Space Engine
Group: Administrators
Russian Federation
Messages: 4800
Status: Offline
First of all, you must always keep in mind that SE requires 3-dimensional coordinates for a star. This must be obvious. The first GAIA release provides parallaxes (i.e. distances) for only 1 million stars. The other billion stars have only 2 coordinates provided - right ascension and declination. SE can easily handle a 1 million star catalog.

Second, to handle 1 billion stars (when the final data set is released in 2022), some adaptive loading/rendering algorithm is needed, similar to those used for planet textures. Space must be subdivided into cubes which are streamed from disk as needed. However, maybe in 2022 one could simply load everything into memory, like SE handles the HIPPARCOS catalog now smile
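A minimal sketch of that cube subdivision idea (the 100-parsec cell size and the bucketing scheme are illustrative assumptions, not an actual SE design):

```python
import math
from collections import defaultdict

CUBE_SIZE_PC = 100.0  # hypothetical cell size; a real engine would tune this

def cube_key(x: float, y: float, z: float) -> tuple:
    """Map a star's galactic XYZ position (parsecs) to its cube index."""
    return (math.floor(x / CUBE_SIZE_PC),
            math.floor(y / CUBE_SIZE_PC),
            math.floor(z / CUBE_SIZE_PC))

# Bucket stars by cube; each bucket would become one streamable file on disk,
# loaded only when the camera approaches that region of the galaxy.
buckets = defaultdict(list)
for star in [(12.0, -340.5, 77.1), (15.9, -301.0, 60.2), (512.0, 0.0, 0.0)]:
    buckets[cube_key(*star)].append(star)

print(len(buckets))  # -> 2: the first two stars share cube (0, -4, 0)
```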





 
ProteusDate: Thursday, 15.09.2016, 15:35 | Message # 7
Explorer
Group: Users
United States
Messages: 173
Status: Offline
What about an option to have the data streamed from an online server as stars are approached or searched for?




 
n0b0dyDate: Thursday, 15.09.2016, 16:13 | Message # 8
Explorer
Group: Users
Pirate
Messages: 297
Status: Offline
Given the fact that we also have the SDSS catalog, if we combine this with the GAIA catalog (when it is ready for release), would it help if SE went 64-bit? (I know this is a lot of work for SpaceEngineer, but if there is a quick way / workaround for this, it would be great.)
 
DoctorOfSpaceDate: Thursday, 15.09.2016, 16:25 | Message # 9
Galaxy Architect
Group: Global Moderators
Pirate
Messages: 3600
Status: Offline
Quote n0b0dy ()
would it help if SE goes 64 bit?


If you use the search function you will find plenty of posts where SpaceEngineer has talked about 64-bit support:

http://en.spaceengine.org/forum/21-3022-115#60416





Intel Core i7-5820K 4.2GHz 6-Core Processor
G.Skill Ripjaws V Series 32GB (4 x 8GB) DDR4-2400 Memory
EVGA GTX 980 Ti SC 6GB
 
form_d_kDate: Thursday, 22.09.2016, 23:29 | Message # 10
Astronaut
Group: Users
United States
Messages: 68
Status: Offline
Quote JackDole ()
The HIPPARCOS.csv catalog in SE needs about 70 bytes per star. That would be about 70 gigabytes for a billion stars. Packed I guess about 25 gigabytes.


If Celestia allocated about 17 bytes per star, my rough calculation is that the database could be reduced to ~7GB using 7zip (I guess that'd be the LZMA compression algorithm). Doable. Painfully so. But doable. :P
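For a feel of what such a pass could look like, here is a sketch using synthetic 17-byte records and LZMA from Python's standard library (the records and their compression ratio are illustrative; real Gaia data would compress differently):

```python
import lzma
import random
import struct

random.seed(42)

# Synthetic catalog chunk: 10,000 fake 17-byte star records
# (XYZ floats, luminosity float, 1-byte spectral code).
records = b"".join(
    struct.pack("<fff f B",
                random.uniform(-1e3, 1e3),
                random.uniform(-1e3, 1e3),
                random.uniform(-1e3, 1e3),
                random.uniform(0.0, 1e4),
                random.randrange(256))
    for _ in range(10_000)
)

compressed = lzma.compress(records)
# The achievable ratio depends heavily on the data's structure.
print(len(records), len(compressed))
```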
 
form_d_kDate: Thursday, 22.09.2016, 23:46 | Message # 11
Astronaut
Group: Users
United States
Messages: 68
Status: Offline
I wonder if you could do something clever... have the database distributed, and have SpaceEngine stream what it needs from the interwebs. And if there is no network connection, it falls back to generating stars procedurally?

Maybe another alternative would be to umm... reverse procedural generation?

For each database entry, determine what input the SpaceEngine procedural generation algorithm would require to generate that star. Once that's done for each entry, determine which stars have similar (or the same??) characteristics, and store that in some sort of hash table? Then compress the result? I imagine that even if this worked, it would take a huge amount of CPU cycles to process.

I probably am talking nonsense. wacko
 
SpaceEngineerDate: Friday, 23.09.2016, 13:55 | Message # 12
Author of Space Engine
Group: Administrators
Russian Federation
Messages: 4800
Status: Offline
Quote form_d_k ()
If Celestia allocated about 17 bytes per star

Currently SE uses this data per star:

Coordinates (XYZ or RA/Dec/Dist) - 3 floats (12 bytes)
Luminosity or app mag - 1 float (4 bytes)
Spectral class - currently packed to 1 short (2 bytes), but probably must be extended to 1 int (4 bytes).

So 18 bytes total (Celestia obviously uses at least the same data, so your value of 17 bytes is wrong). But don't forget about the name or catalog ID. Currently SE uses a csv format where everything is text, but older versions used a separate file with names, like Celestia. Adding a 32-bit index for the database ID (which is enough to identify 4 billion stars) adds 4 bytes.

Also, future versions of SE will use proper motion data, so stars in SE will have velocities (and maybe it will be possible to observe change in constellation shape, who knows). So add another 3 floats (12 bytes).

So in total we'll have 34 bytes per star. Probably some floats may be packed to 16-bit half-precision (2 bytes), so maybe this can be reduced to 32 bytes (computers like powers of two).
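The byte arithmetic above can be checked with Python's `struct` module (the layout below only illustrates the sizes listed in the post, not SE's actual file format; which float gets demoted to half-precision is my assumption):

```python
import struct

# 34-byte record from the post: XYZ (3 floats), luminosity (float),
# spectral class (short), catalog ID (uint32), velocity (3 floats).
FULL = struct.Struct("<fff f h I fff")
print(FULL.size)  # -> 34

# Storing e.g. luminosity as a 16-bit half-precision float ("e") trims
# the record to the power-of-two-friendly 32 bytes.
PACKED = struct.Struct("<fff e h I fff")
print(PACKED.size)  # -> 32
```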





 
SpaceEngineerDate: Friday, 23.09.2016, 14:03 | Message # 13
Author of Space Engine
Group: Administrators
Russian Federation
Messages: 4800
Status: Offline
Quote form_d_k ()
I wonder if you could do something clever... have the database distributed, and SpaceEngine streams what it needs from the interwebs. And if there is no network connection, it falls back to generating stars procedurally?

No. SE cannot determine whether a given place belongs to a real star or not. It generates procedural stars everywhere except in the "bubble" with faded edges near the Sun (simply by skipping procedural stars whose visual magnitude, as seen from Earth, would be brighter than 7m). So if a catalog of real stars cannot be loaded, there will be a hole.
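A sketch of that magnitude-cutoff logic, assuming the cutoff means procedural stars that would appear brighter than magnitude 7 from Earth are skipped (it uses the standard distance-modulus relation m = M + 5·log10(d / 10 pc); the function names are mine):

```python
import math

MAG_LIMIT = 7.0  # approximate completeness limit of the real-star catalog

def apparent_magnitude(abs_mag: float, distance_pc: float) -> float:
    """Distance-modulus relation: m = M + 5*log10(d / 10 pc)."""
    return abs_mag + 5.0 * math.log10(distance_pc / 10.0)

def keep_procedural_star(abs_mag: float, distance_pc: float) -> bool:
    """Keep a procedural star only if it would be fainter than the catalog
    limit as seen from Earth; brighter ones are covered by real stars."""
    return apparent_magnitude(abs_mag, distance_pc) > MAG_LIMIT

# A Sun-like star (M ~ 4.8) at 20 pc: m ~ 6.3, brighter than 7 -> skipped.
print(keep_procedural_star(4.8, 20.0))   # -> False
# The same star at 100 pc: m ~ 9.8, fainter than 7 -> kept.
print(keep_procedural_star(4.8, 100.0))  # -> True
```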

Streaming from the internet can be used in the Planetarium, but not in the single-player game. If there is no internet connection, the user could lose access to his buildings or stations near those stars. The multiplayer game requires internet and access to the server, so this is not a problem for it either. The problem for both games is constantly updated catalogs. Science never stops; new stars and exoplanets will be added. But for the game, we should freeze this at some point - a weekly/monthly change in the game Universe is not what players want. The same goes for changes in the procedural engine. A new, updated game client should use a new server with an updated real star/planet database.





 
form_d_kDate: Friday, 23.09.2016, 19:24 | Message # 14
Astronaut
Group: Users
United States
Messages: 68
Status: Offline
Yeah, and I was just thinking there's no good way to reverse a procedural algorithm (take a known output and generate the seed that would produce it). I think it's possible, but depending on the size of the seed, the time it would take is very unrealistic.
 
FastFourierTransformDate: Monday, 26.09.2016, 20:11 | Message # 15
Pioneer
Group: Local Moderators
Spain
Messages: 542
Status: Offline
Quote SpaceEngineer ()
Also, future versions of SE will use proper motion data, so stars in SE will have velocities (and maybe it will be possible to observe change in constellation shape, who knows)


I'm going to cry of joy the day I see this cry

So, if I have understood correctly:
You are saying that for each star we should have a position vector, velocity vector, name ID, spectral class and luminosity. A total of 36 bytes per star, or 36GB for the full GAIA final catalogue. But GAIA will provide even more data:

At least 10% of GAIA's stars will have lightcurves, so maybe we would have to add 4 bytes for the rotation of the star and maybe 12 bytes for changes in luminosity as a periodic function of time (I'm making rough estimates). That means 16 extra bytes for 100 million stars. So 1.6 GB more.

From GAIA's spectrometry, the metallicity (and thus the age) of a star can be inferred, as well as the effective temperature (and thus the radius and surface gravity), the mass, and other properties. So if we suppose that the spectrum of each star can be more or less encoded in 80 bytes (no more than 20 parameters), and that we have this for 10% of the stars, it will add 8 GB more!
But this is important in my opinion, because stars are more diverse than what is shown by the OBAFGKM classification. From spectral properties you can distinguish a Wolf-Rayet star from an O-type star, for example.

All of this would make, in my opinion, a very beautiful 45GB addon for Space Engine one day.
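Tallying those rough figures (all numbers are the estimates from this post, not measured values):

```python
# Rough tally of the estimated addon size.
base = 36 * 1_000_000_000       # 36 bytes/star for 1 billion stars
lightcurves = 16 * 100_000_000  # +16 bytes for the 10% with lightcurves
spectra = 80 * 100_000_000      # +80 bytes of spectral parameters, 10%
total_gb = (base + lightcurves + spectra) / 1e9
print(total_gb)  # -> 45.6
```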
 