Thursday, June 27, 2013

The Natural User Interface

An NUI (Natural User Interface) for a 2D GUI must first recognize and track the captured 3D image and transform it into UI events, but that is not my concern because I only work on the 3D VR/AR UI. The NUI in the 3D VR/AR UI directly maps the captured 3D image into the virtual reality as a model added to the scene and then lets the models interact with each other, so no event processing is necessary. In the 3D VR/AR UI, pattern recognition and tracking occur only in the 3D VR/AR space.
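A minimal sketch of this contrast, under my own assumptions: the Scene, Model, and capture_depth_frame names below are hypothetical placeholders, not a real engine API. The point is that the capture is inserted into the scene as a model instead of being translated into UI events.

```python
# Hypothetical sketch: the capture becomes a model in the scene, not an event.

class Model:
    """A point cloud reconstructed from one captured 3D frame."""
    def __init__(self, points):
        self.points = points              # list of (x, y, z) tuples

class Scene:
    """The shared VR/AR scene; interaction happens inside the scene itself."""
    def __init__(self):
        self.models = []

    def add(self, model):
        self.models.append(model)         # no UI event is generated here

    def step(self):
        # Physics / collision between the models replaces event handling.
        pass

def capture_depth_frame():
    # Placeholder for a depth-camera or lidar capture.
    return [(0.1, 0.2, 1.5), (0.1, 0.3, 1.5)]

scene = Scene()
scene.add(Model(capture_depth_frame()))   # the capture becomes a model in VR
scene.step()                              # the models themselves interact
```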

Wednesday, June 26, 2013

The Really Real Virtual Reality: virtual-real synchronization system


VR = R ➡️ VR + VR objects = AR 


The ARDOM ( Augmented Reality Distributed On Mobiles ) maps the real world minute by minute: a large number of users moving on the mobile network cooperate, with radar- or lidar-equipped mobile devices, to scan the real world into a distributed virtual reality. Each pixel of a captured image is mapped to the position of ( the located coordinate + the 3D direction vector * the radar or lidar distance ) in the 3+1D space. The 4D space will also include our bodies and even our neural signals ( our souls ), so welcome to 《The Matrix》. The objects with access permissions in the augmented reality are not all stored in one place; they are distributed and cached only where someone may need them to be presented in the augmented reality. Everything occurring in the real world is scanned into the virtual reality; everything happening in the virtual reality can change the real world too, but not necessarily immediately. It can be simulated until everything is confirmed to be correct and problem-free, and then the drones update the real world according to the new, correct virtual reality.
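A minimal sketch of the per-pixel mapping described above, assuming the device already knows its own located coordinate and the unit direction vector of each pixel's ray; the Vec3 and pixel_to_world names are illustrative only.

```python
# position = located coordinate + direction vector * lidar distance
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def scaled(self, s: float) -> "Vec3":
        return Vec3(self.x * s, self.y * s, self.z * s)

    def plus(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)

def pixel_to_world(device_position: Vec3, ray_direction: Vec3, lidar_distance: float) -> Vec3:
    """Map one captured pixel to its 3D position in the shared world map."""
    return device_position.plus(ray_direction.scaled(lidar_distance))

# Example: device located at (10, 0, 2), pixel ray pointing along +x, lidar reads 3.5 m.
point = pixel_to_world(Vec3(10, 0, 2), Vec3(1, 0, 0), 3.5)
print(point)   # Vec3(x=13.5, y=0.0, z=2.0)
```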



The augmented reality does not only embed virtual objects in the scanned real scenes; it also changes the real world according to some of the virtual objects by means of the drones. The viewers watch the virtual reality directly from the Cloud but do not change it; they change the actors in the real world, and the sensors in those actors then update the virtual reality in the Cloud. In pure simulation mode, the actors with sensors are replaced by virtual ones inside the virtual reality in the Cloud. However, the Cloud is not a set of servers; the Cloud is the distribution. The actors in the real world are changed by accessing the Cloud, and they are also synchronized parts of the Cloud distribution.
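To make the roles concrete, here is a minimal sketch under my own assumptions; the Cloud, Actor, and Sensor classes are hypothetical and stand in for the distributed system described above.

```python
class Cloud:
    """The distributed virtual reality, reduced here to a shared state dict."""
    def __init__(self):
        self.vr_state = {}

    def read(self, key):
        return self.vr_state.get(key)        # viewers only read

    def update(self, key, value):
        self.vr_state[key] = value           # only sensors write back

class Actor:
    """A physical device in R that can be commanded through the Cloud."""
    def __init__(self, name):
        self.name = name
        self.real_state = None

    def act(self, command):
        self.real_state = command            # changes the real world

class Sensor:
    """Attached to an actor; pushes the real state back into the Cloud."""
    def __init__(self, actor, cloud):
        self.actor, self.cloud = actor, cloud

    def sync(self):
        self.cloud.update(self.actor.name, self.actor.real_state)

cloud = Cloud()
door = Actor("door")
sensor = Sensor(door, cloud)

door.act("open")            # a viewer changes the actor in R ...
sensor.sync()               # ... and the sensor updates VR in the Cloud
print(cloud.read("door"))   # "open"
```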

The most basic virtual-real synchronization does not apply a user's action to VR and R at the same time; it only applies it to R, and VR is then updated directly from the resulting change in R. The reason is that the update to R may fail with an exception, in which case VR would have to be restored to its state before the update.

However, some objects that exist only in VR and not in R also need to appear to exist in R, so the objects must be moved in VR first in order to know how to move them in R. But while propagating user actions to VR and then from VR to R, the system may also encounter objects that have not yet been updated from R to VR. So, when a user action is applied to VR and R at the same time, if either side raises an exception, both must be restored to their state before the action.

As for switching to a simulation mode decoupled from reality, simply turn off the sensing and actuation toward reality.
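A minimal sketch of this synchronization rule, under my own assumptions: the World class with snapshot/restore and the simulation_only flag are hypothetical, but they illustrate applying an action to both sides, rolling both back on failure, and cutting off actuation in simulation mode.

```python
class World:
    """A hypothetical VR or R endpoint with snapshot/rollback semantics."""
    def __init__(self, name):
        self.name = name
        self.state = {}

    def snapshot(self):
        return dict(self.state)

    def restore(self, snap):
        self.state = snap

    def apply(self, action):
        # In R this would drive actuators; here it just mutates state.
        key, value = action
        self.state[key] = value

def apply_action(action, vr: World, r: World, simulation_only: bool = False):
    """Apply one user action to VR and R together; roll both back on failure."""
    vr_snap, r_snap = vr.snapshot(), r.snapshot()
    try:
        vr.apply(action)
        if not simulation_only:      # simulation mode: actuation toward R is off
            r.apply(action)
    except Exception:
        vr.restore(vr_snap)          # either side failed: restore both
        r.restore(r_snap)
        raise

vr, r = World("VR"), World("R")
apply_action(("lamp", "on"), vr, r)                          # synced to both
apply_action(("lamp", "off"), vr, r, simulation_only=True)   # VR only
```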





The virtual-real synchronization system lets you interact with physical objects anywhere in the world as long as you have access permission. The ultimate form is a screen presenting the scene as if the soul had left the body and instantly moved to the other side of the world, or as if seen from the perspective of God: from one place you can watch and interact with any place in the world through the screen, which is equivalent to reality and the virtual world always being updated in both directions.



3D LiDAR Technology

SpaceTop 3D interface lets you reach inside your computer screen

Tuesday, June 25, 2013

The ionized-fluid magnetically guided surface panel

The bases of the Airborne Special Forces often run irregular, unscheduled, unannounced airborne raid exercises against one another. One night, in the middle of the night, my younger brother heard the sound frequency of a transport plane's propellers, but nobody around him sensed anything and told him he was overthinking it; at that time the transport plane was still far from the base. The base commander, figuring that a free drill should not be wasted, ordered the base to prepare for combat. As a result, when the airborne troops from the other unit landed, the base troops had long been lying in ambush waiting for them and captured them all without a single shot being fired. The noise of a propeller turbine comes from its collisions and vibrations against air particles, especially the impact of the airflow in the propulsion direction on the blade surfaces; electric cars today are quiet, but electric aircraft and electric boats still cannot overcome the noise problem. If there were a kind of surface panel that could ionize the fluid at its surface so that it becomes charged, and then magnetically guide the ionized flow away from the surface to form the airstream, it could completely remove wind-resistance noise at the surface. As the ion flow slides over the magnetically guiding panel, it would also induce a current in the unpowered conductive parts of the panel and charge them, and after the ion flow slides past the panel it is electrically neutralized, leaving no ion trail.

Tuesday, June 11, 2013

The dynamic network of the mobile spheres

Suppose you are given an unlimited number of spheres whose surfaces are fully covered with directional signal cells; use them to design a network. Further, if the spheres are moving dynamically, how do you build the network? Design a mechanism for this dynamic network. It could serve for communication among distant planets, and the same architecture could also work on a planet or in nearby space. If we cut a sphere in half, it becomes a hat. Establish the network among these ever-moving half-sphere hats without any base station, so that the hat itself is the ever-moving base station in enemy territory, and it must use only directional signals to stay away from enemy detection. The same mechanism can also be applied to communication among planets in outer space: there is no way to have a fixed base station either, because the planets are always in revolution and rotation, and non-directional signals cannot be used either, because the planets are large.
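One possible building block, sketched here under my own assumptions (the cell layout, the Sphere class, and aim_at are hypothetical): each moving sphere periodically recomputes which surface cell points most directly at each known neighbor and transmits only through that cell.

```python
import math

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

class Sphere:
    """A moving node whose surface is covered with directional signal cells."""
    def __init__(self, position, cell_directions):
        self.position = position                  # (x, y, z)
        self.cells = [unit(d) for d in cell_directions]

    def aim_at(self, neighbor_position):
        """Pick the cell whose outward direction best matches the bearing to a neighbor."""
        bearing = unit(tuple(n - p for n, p in zip(neighbor_position, self.position)))
        return max(range(len(self.cells)), key=lambda i: dot(self.cells[i], bearing))

# Six axis-aligned cells as a toy layout; a real hat would carry many more.
cells = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
hat = Sphere((0, 0, 0), cells)
print(hat.aim_at((10, 2, 0)))   # 0 -> the +x cell faces this neighbor
```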



Location-based networking: everyone owns a relative-coordinate map created by scanning the neighborhood, and merges in farther areas obtained from far nodes through the near ones. The more memory one owns, the bigger the map one owns. When one node wants to transfer something to another, it decides the path according to the relative coordinates and the nodes' states on the map.
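A minimal sketch of one way such a path decision could work, assuming each node's map simply stores the relative coordinates of known nodes and their links (greedy geographic forwarding; all names here are illustrative, not taken from the post).

```python
import math

def distance(a, b):
    return math.dist(a, b)   # Euclidean distance between relative coordinates

def next_hop(current, destination, coordinate_map, links):
    """Among the current node's neighbors, pick the one closest to the destination."""
    neighbors = links.get(current, [])
    if not neighbors:
        return None
    best = min(neighbors,
               key=lambda n: distance(coordinate_map[n], coordinate_map[destination]))
    # Refuse to forward if no neighbor is closer than we already are (a dead end).
    if distance(coordinate_map[best], coordinate_map[destination]) >= \
       distance(coordinate_map[current], coordinate_map[destination]):
        return None
    return best

# Toy relative-coordinate map owned by node "A".
coords = {"A": (0, 0), "B": (5, 1), "C": (9, 0), "D": (4, -6)}
links = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B"], "D": ["A"]}
print(next_hop("A", "C", coords, links))   # "B"
```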