{ "citation": "@article{doersch2023tapir,\n title={TAPIR: Tracking Any Point with per-frame Initialization\n and temporal Refinement},\n author={Doersch, Carl and Yang, Yi and Vecerik, Mel and Gokay, Dilara and\n Gupta, Ankush and Aytar, Yusuf and Carreira, Joao and Zisserman, Andrew},\n journal={ICCV},\n year={2023}\n}", "configDescription": "Full resolution of 256x256", "configName": "256x256", "description": "A simple rigid-body simulation with GSO objects and an HDRI background.\nThe scene consists of a dome (half-sphere) onto which a random HDRI is projected, \nwhich acts as background, floor and lighting.\nThe scene contains between 10 and 20 random static objects, and between 1 and 3\ndynamic objects (tossed onto the others).\nThe camera moves on a straight line with constant velocity, and the lookat point\nfollows a moving trajectory through the workspace center.\nThe starting point is sampled randomly in a half-sphere shell around the scene,\nand from there the camera moves into a random direction with a random speed between 0 and 8.\nThis sampling process is repeated until a trajectory is found that starts and \nends within the specified half-sphere shell around the center of the scene. \nThe camera always points towards the origin.\n\nStatic objects are spawned without overlap in the region [(-7, -7, 0), (7, 7, 10)],\nand are simulated to fall and settle before the first frame of the scene.\nDynamic objects are spawned without overlap in the region [(-5, -5, 1), (5, 5, 5)], and\ninitialized with a random velocity from the range [(-4, -4, 0), (4, 4, 0)]\nminus the position of the object to bias their trajectory towards the center of\nthe scene.\n\nThe scene is simulated for 2 seconds, with the physical properties of the\nobjects kept at the default of friction=0.5, restitution=0.5 and density=1.0.\n\nThe dataset contains approx 10k videos rendered at 256x256 pixels and 12fps.\n\nEach sample contains the following video-format data:\n(s: sequence length, h: height, w: width)\n\n- \"video\": (s, h, w, 3) [uint8]\n The RGB frames.\n- \"segmentations\": (s, h, w, 1) [uint8]\n Instance segmentation as per-pixel object-id with background=0.\n Note: because of this the instance IDs used here are one higher than their\n corresponding index in sample[\"instances\"].\n- \"depth\": (s, h, w, 1) [uint16]\n Distance of each pixel from the center of the camera.\n (Note this is different from the z-value sometimes used, which measures the\n distance to the camera *plane*.)\n The values are stored as uint16 and span the range specified in\n sample[\"metadata\"][\"depth_range\"]. To convert them back to world-units\n use:\n minv, maxv = sample[\"metadata\"][\"depth_range\"]\n depth = sample[\"depth\"] / 65535 * (maxv - minv) + minv\n- \"forward_flow\": (s, h, w, 2) [uint16]\n Forward optical flow in the form (delta_row, delta_column).\n The values are stored as uint16 and span the range specified in\n sample[\"metadata\"][\"forward_flow_range\"]. To convert them back to pixels use:\n minv, maxv = sample[\"metadata\"][\"forward_flow_range\"]\n depth = sample[\"forward_flow\"] / 65535 * (maxv - minv) + minv\n- \"backward_flow\": (s, h, w, 2) [uint16]\n Backward optical flow in the form (delta_row, delta_column).\n The values are stored as uint16 and span the range specified in\n sample[\"metadata\"][\"backward_flow_range\"]. 
\n\nAdditionally there is rich instance-level information in sample[\"instances\"]:\n- \"mass\": [float32]\n Mass of the object used for simulation.\n- \"friction\": [float32]\n Friction coefficient used for simulation.\n- \"restitution\": [float32]\n Restitution coefficient (bounciness) used for simulation.\n- \"positions\": (s, 3) [float32]\n Position of the object for each frame in world-coordinates.\n- \"quaternions\": (s, 4) [float32]\n Rotation of the object for each frame as quaternions.\n- \"velocities\": (s, 3) [float32]\n Velocity of the object for each frame.\n- \"angular_velocities\": (s, 3) [float32]\n Angular velocity of the object for each frame.\n- \"bboxes_3d\": (s, 8, 3) [float32]\n World-space corners of the 3D bounding box around the object.\n- \"image_positions\": (s, 2) [float32]\n Normalized (0, 1) image-space (2D) coordinates of the center of mass of the\n object for each frame.\n- \"bboxes\": (None, 4) [float32]\n The normalized image-space (2D) coordinates of the bounding box\n [ymin, xmin, ymax, xmax] for all the frames in which the object is visible\n (as specified in bbox_frames).\n- \"bbox_frames\": (None,) [int]\n A list of all the frames in which the object is visible.\n- \"visibility\": (s,) [uint16]\n Visibility of the object in number of pixels for each frame (can be 0).\n- \"asset_id\": [str] Asset id from the Google Scanned Objects dataset.\n- \"category\": [\"Action Figures\", \"Bag\", \"Board Games\",\n \"Bottles and Cans and Cups\", \"Camera\", \"Car Seat\",\n \"Consumer Goods\", \"Hat\", \"Headphones\", \"Keyboard\", \"Legos\",\n \"Media Cases\", \"Mouse\", \"None\", \"Shoe\", \"Stuffed Toys\", \"Toys\"]\n- \"scale\": float between 0.75 and 3.0\n- \"is_dynamic\": bool indicating whether (at the start of the scene) the object\n is sitting on the floor or is being tossed.
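\n\nFor reference, a minimal sketch of how a sample can be loaded with TFDS (the\ndata_dir below is an assumption; point it at wherever this dataset was built\nor published):\n\n  import tensorflow_datasets as tfds\n  ds = tfds.load(\"panning_movi_e/256x256\", split=\"train\",\n                 data_dir=\"gs://kubric-public/tfds\")  # assumed location\n  sample = next(iter(tfds.as_numpy(ds)))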
\n\nInformation about the camera in sample[\"camera\"]\n(given for each frame so as to stay consistent with other variants of the\ndataset):\n\n- \"focal_length\": [float32]\n- \"sensor_width\": [float32]\n- \"field_of_view\": [float32]\n- \"positions\": (s, 3) [float32]\n- \"quaternions\": (s, 4) [float32]\n\n\nAnd finally information about collision events in sample[\"events\"][\"collisions\"]:\n\n- \"instances\": (2,) [uint16]\n Indices of the two instances between which the collision happened.\n Note that collisions with the floor/background objects are marked with 65535.\n- \"frame\": [int32]\n Frame in which the collision happened.\n- \"force\": [float32]\n The force (strength) of the collision.\n- \"position\": (3,) [float32]\n Position of the collision event in 3D world coordinates.\n- \"image_position\": (2,) [float32]\n Position of the collision event projected onto normalized 2D image coordinates.\n- \"contact_normal\": (3,) [float32]\n The normal vector of the contact (direction of the force).", "fileFormat": "tfrecord", "location": { "urls": [ "https://github.com/google-research/kubric" ] }, "moduleName": "panning_movi_e", "name": "panning_movi_e", "releaseNotes": { "1.0.0": "initial release" }, "splits": [ { "filepathTemplate": "{DATASET}-{SPLIT}.{FILEFORMAT}-{SHARD_X_OF_Y}", "name": "train", "numBytes": "285614954814", "shardLengths": [ "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10",
"9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", "10", "9", 
"10", "9", "10", "9", "10", "9", "10", "9" ] }, { "filepathTemplate": "{DATASET}-{SPLIT}.{FILEFORMAT}-{SHARD_X_OF_Y}", "name": "validation", "numBytes": "7315117469", "shardLengths": [ "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4", "4", "4", "3", "4", "4", "4", "4" ] } ], "version": "1.0.0" }