{"id":117122,"date":"2024-10-07T15:00:25","date_gmt":"2024-10-07T07:00:25","guid":{"rendered":"https:\/\/www.tm-robot.com\/?post_type=docs&#038;p=117122"},"modified":"2024-10-07T18:35:13","modified_gmt":"2024-10-07T10:35:13","slug":"optimize-your-object-handling-with-upward-looking-camera","status":"publish","type":"docs","link":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/","title":{"rendered":"Optimize Your Object Handling with Upward-Looking Camera"},"content":{"rendered":"<p>Examples are valid for:<br \/>\nTMflow Software version:\u00a0 All.<br \/>\nTM Robot Hardware version: All<br \/>\nOther specific requirements:<\/p>\n<ul>\n<li>Tool Shift &#8211; TMflow Manual<\/li>\n<li>Upward-looking camera &#8211; TMvision manual<\/li>\n<\/ul>\n<p>Note that older or newer software versions may have different results.<\/p>\n<hr \/>\n<h1>Introduction<\/h1>\n<p>In pick and place operations, common industrial processes typically use robots to perform pick and place operations, starting from a picking position and moving to a placing position. This methods might require precise positoning when teaching the target points and frequent recalibration, which sometimes can be time-consuming, especially when dealing with tiny object or target point is difficult to reach.<\/p>\n<p>But have you ever wondered if you achieve the same results by reversing the sequence? 
\u2014 setting up and placing the workpiece first, followed by picking it up for vision inspection, and adjusting based on the visual data for any positional discrepancies before final positioning.<\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg\"><img loading=\"lazy\" class=\"wp-image-117124 aligncenter\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg\" alt=\"\" width=\"613\" height=\"407\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg 865w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295-300x199.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295-768x510.jpg 768w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295-360x239.jpg 360w\" sizes=\"(max-width: 613px) 100vw, 613px\" \/><\/a><\/p>\n<p>Pick-and-place operations at specific locations are best achieved using an external upward-looking camera with position compensation and tool shift. This method helps reduce the need for frequent teaching, saving time by allowing the tool to adjust its position accurately at the target point without requiring additional manual intervention.<\/p>\n<h1>Optimal Scenario for Applying This Method<\/h1>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/gif1-shift-left-right.gif\"><img loading=\"lazy\" class=\"aligncenter wp-image-117130\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/gif1-shift-left-right.gif\" alt=\"\" width=\"343\" height=\"193\" \/><\/a><\/p>\n<h2>1. 
Difficult Target Locations<\/h2>\n<p>In traditional pick-and-place systems, achieving precision when the target position is in a difficult or obstructed location can be challenging. For example, when a robot arm needs to place an object inside a small cavity, onto an angled surface, or into a confined space where direct line-of-sight is limited, traditional methods may struggle with accuracy. These hard-to-reach areas may also require specific orientations for the object to fit properly. Using an upward-looking camera with tool-shift functionality can overcome these challenges by providing real-time positional data, allowing precise adjustments even when the target is not easily visible. This reduces the need for multiple teaching sessions and enhances accuracy in complex environments.<\/p>\n<h2>2. Tiny Objects or Small Tolerance<\/h2>\n<p>When dealing with tiny objects, such as micro-components in electronics manufacturing or small mechanical parts, the margin for error is extremely small. Traditional pick-and-place methods that rely on pre-recorded points may not achieve the high level of precision required, especially if the object must be placed within a tight tolerance range (e.g., less than 0.1 mm). In these scenarios, an upward-looking camera combined with tool shift can improve precision by detecting and correcting position errors after picking up the object, ensuring the object is perfectly aligned before placement.<\/p>\n<h2>3. Fast Speed: Small-Batch, High-Variety Production<\/h2>\n<p>In environments where small batches of diverse products are common, such as custom manufacturing, manual teaching for each product can consume significant time. Traditional pick-and-place systems might require individual setup for each object type, but by utilizing an upward-looking camera with tool shift, you can reduce setup time significantly. 
This method enables a single teaching session to adapt to multiple objects, streamlining production for small-batch, high-variety environments. The ability to rapidly shift between different products minimizes downtime and maximizes efficiency.<\/p>\n<h2>4. Quality: Reducing Human Error in Precision Tasks<\/h2>\n<p>Human error is often a major factor in inconsistent quality during manual pick-and-place tasks, particularly when precision is required. Manual labor introduces variability in positioning, which can result in misaligned or defective placements. By automating the process with an upward-looking camera and tool shift, this method ensures consistent accuracy and precision, reducing the risk of defects caused by misalignment. The system can self-correct and adjust on the fly, offering a stable and repeatable process that enhances product quality, especially for tasks that demand high precision.<\/p>\n<p><strong>Real Case Example:<\/strong> In semiconductor manufacturing, placing tiny chips on a board with high accuracy is critical. 
The upward-looking camera enables the robot to adjust for any micro-level misalignments and achieve placement with minimal error.<\/p>\n<h1>Process Workflow<\/h1>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117136\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676.jpg\" alt=\"\" width=\"661\" height=\"343\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676.jpg 1072w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676-300x156.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676-1024x532.jpg 1024w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676-768x399.jpg 768w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282694676-360x187.jpg 360w\" sizes=\"(max-width: 661px) 100vw, 661px\" \/><\/a><\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282730674.jpg\"><img loading=\"lazy\" class=\"aligncenter size-full wp-image-117142\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282730674.jpg\" alt=\"\" width=\"866\" height=\"59\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282730674.jpg 866w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282730674-300x20.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282730674-768x52.jpg 768w, 
https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282730674-360x25.jpg 360w\" sizes=\"(max-width: 866px) 100vw, 866px\" \/><\/a><\/p>\n<h2><span style=\"color: #ff0000;\">TEACH THE TARGET POSITION :\u00a0<\/span><\/h2>\n<p><span style=\"color: #ff0000;\"><strong>Note : Make sure a workspace has been created for Upward Looking \u2013 Position Compensation before starting this project.<\/strong><\/span><\/p>\n<ol>\n<li>Place the object at the target (place) position and record the point.<\/li>\n<\/ol>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282952459.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117148\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282952459.jpg\" alt=\"\" width=\"236\" height=\"177\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282952459.jpg 366w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282952459-300x225.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282952459-360x270.jpg 360w\" sizes=\"(max-width: 236px) 100vw, 236px\" \/><\/a><\/p>\n<p>2.\u00a0Pick up the object with the tool and create an upward-looking Vision Job, referred to as \u201c<span style=\"color: #ff0000;\">UPL<\/span>\u201d in this example.<\/p>\n<p>Then, <span style=\"color: #ff0000;\">[save as]<\/span> another Vision Job based on this one, called \u2018<span style=\"color: #ff0000;\">UPL1<\/span>\u2019 in this example.<\/p>\n<p>(Note that both vision jobs should be created from the same base vision job, so that their features and reference points stay aligned and the compensated points remain accurate after the tool shift. 
)<\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283032354.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117154\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283032354.jpg\" alt=\"\" width=\"571\" height=\"182\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283032354.jpg 774w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283032354-300x96.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283032354-768x245.jpg 768w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283032354-360x115.jpg 360w\" sizes=\"(max-width: 571px) 100vw, 571px\" \/><\/a><\/p>\n<p>3. Copy the target position point from Step 1 and <span style=\"color: #ff0000;\"><u>overwrite the tool<\/u><\/span> to use the TCP generated by the first vision job \u2018UPL\u2019 <span style=\"color: #ff0000;\">V1<\/span> in this example.<\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283098432.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117160\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283098432.jpg\" alt=\"\" width=\"277\" height=\"208\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283098432.jpg 366w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283098432-300x225.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283098432-360x270.jpg 360w\" sizes=\"(max-width: 277px) 100vw, 277px\" \/><\/a><\/p>\n<h2><span style=\"color: #ff0000;\">TEACH THE INITIAL POSITION 
:\u00a0<\/span><\/h2>\n<p>4. Move your object to the initial (picking) position, then drag the robot to the picking point and record it.<\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283156584.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117166\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283156584.jpg\" alt=\"\" width=\"268\" height=\"202\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283156584.jpg 378w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283156584-300x225.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283156584-360x270.jpg 360w\" sizes=\"(max-width: 268px) 100vw, 268px\" \/><\/a><\/p>\n<p>5. Drag a vision node into the flow and select \u2018<span style=\"color: #ff0000;\">UPL1<\/span>\u2019 as the Vision Job in this example.<\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283205980.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117172\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283205980.jpg\" alt=\"\" width=\"277\" height=\"207\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283205980.jpg 338w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283205980-300x225.jpg 300w\" sizes=\"(max-width: 277px) 100vw, 277px\" \/><\/a><\/p>\n<p>6. <span style=\"color: #ff0000;\">Copy<\/span> the target point created in <span style=\"color: #ff0000;\">step 3<\/span>. 
Then click on the pencil icon to edit the point on TMflow, select [<span style=\"color: #ff0000;\"><strong>TOOL SHIFT<\/strong> to vision TCP_UPL1<\/span> in this example] and choose [<span style=\"color: #ff0000;\"><strong>KEEP PATH<\/strong><\/span>]<\/p>\n<p><img loading=\"lazy\" class=\"aligncenter wp-image-117178\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283732894.jpg\" alt=\"\" width=\"597\" height=\"205\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283732894.jpg 667w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283732894-300x103.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283732894-360x124.jpg 360w\" sizes=\"(max-width: 597px) 100vw, 597px\" \/><\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283779727.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117184\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283779727.jpg\" alt=\"\" width=\"393\" height=\"252\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283779727.jpg 518w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283779727-300x192.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728283779727-360x231.jpg 360w\" sizes=\"(max-width: 393px) 100vw, 393px\" \/><\/a><\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7.jpg\"><img loading=\"lazy\" class=\"aligncenter wp-image-117196\" 
src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7.jpg\" alt=\"\" width=\"562\" height=\"755\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7.jpg 1000w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7-223x300.jpg 223w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7-762x1024.jpg 762w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7-768x1031.jpg 768w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/7F82FF35-A78A-4C81-9802-3F5C0AF218F7-360x483.jpg 360w\" sizes=\"(max-width: 562px) 100vw, 562px\" \/><\/a><\/p>\n<p><a href=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728284264119.jpg\"><img loading=\"lazy\" class=\"aligncenter size-full wp-image-117202\" src=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728284264119.jpg\" alt=\"\" width=\"434\" height=\"367\" srcset=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728284264119.jpg 434w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728284264119-300x254.jpg 300w, https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728284264119-360x304.jpg 360w\" sizes=\"(max-width: 434px) 100vw, 434px\" \/><\/a><\/p>\n<p>Using TOOL SHIFT and KEEP PATH together lets the robot correct for any deviation in the object&#8217;s position or orientation after picking it up. 
The upward-looking camera calculates the adjustment, and the tool shifts from V1 to V2 to place the object accurately without changing the robot\u2019s programmed path.<\/p>\n<p>&nbsp;<\/p>\n<h2><span style=\"color: #ff0000;\"><strong>FINAL &#8211; PLAY THE PROJECT<\/strong><\/span><\/h2>\n<p>After teaching the target and initial positions, you can start your project directly from the initial position (step 4) in TMflow and <span style=\"color: #ff0000;\"><strong>exclude <\/strong><\/span>the vision-job teaching process<strong> [start &#8211;&gt;<\/strong><strong> step 4 &#8211;&gt;<\/strong><strong> step 6]<\/strong><\/p>\n<h1><strong>Conclusions<\/strong><\/h1>\n<p>Integrating an upward-looking camera with tool-shift functionality offers a streamlined and precise solution for performing pick-and-place operations from one specific point to another without the need for repeated manual teaching. Unlike traditional methods that simply record pick and target points, this approach leverages position compensation and tool-shift adjustments to ensure accuracy. This method is particularly suitable for cases involving tiny objects with very precise positions and minimal tolerances, such as placing components on CNC machines or chips on PCBs. With one-time teaching and real-time corrections, the system delivers precise placement without repeated calibrations, enhancing efficiency.<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Examples are valid for: TMflow Software version:\u00a0 All.  
[&hellip;]<\/p>\n","protected":false},"author":8760,"featured_media":0,"parent":0,"comment_status":"closed","ping_status":"closed","template":"","meta":[],"doc_category":[4148],"doc_tag":[],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v16.9 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Optimize Your Object Handling with Upward-Looking Camera | Techman Robot<\/title>\n<meta name=\"robots\" content=\"noindex, follow\" \/>\n<meta property=\"og:locale\" content=\"zh_TW\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Optimize Your Object Handling with Upward-Looking Camera | Techman Robot\" \/>\n<meta property=\"og:description\" content=\"Examples are valid for: TMflow Software version:\u00a0 All. [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/\" \/>\n<meta property=\"og:site_name\" content=\"Techman Robot\" \/>\n<meta property=\"article:modified_time\" content=\"2024-10-07T10:35:13+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.tm-robot.com\/de\/#organization\",\"name\":\"Techman Robot\",\"url\":\"https:\/\/www.tm-robot.com\/de\/\",\"sameAs\":[],\"logo\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/www.tm-robot.com\/de\/#logo\",\"inLanguage\":\"zh-TW\",\"url\":\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2018\/09\/logo.png\",\"contentUrl\":\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2018\/09\/logo.png\",\"width\":221,\"height\":196,\"caption\":\"Techman 
Robot\"},\"image\":{\"@id\":\"https:\/\/www.tm-robot.com\/de\/#logo\"}},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.tm-robot.com\/de\/#website\",\"url\":\"https:\/\/www.tm-robot.com\/de\/\",\"name\":\"Techman Robot\",\"description\":\"Intelligent Cobots for a World of Applications\",\"publisher\":{\"@id\":\"https:\/\/www.tm-robot.com\/de\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.tm-robot.com\/de\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"zh-TW\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#primaryimage\",\"inLanguage\":\"zh-TW\",\"url\":\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg\",\"contentUrl\":\"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#webpage\",\"url\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/\",\"name\":\"Optimize Your Object Handling with Upward-Looking Camera | Techman 
Robot\",\"isPartOf\":{\"@id\":\"https:\/\/www.tm-robot.com\/de\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#primaryimage\"},\"datePublished\":\"2024-10-07T07:00:25+00:00\",\"dateModified\":\"2024-10-07T10:35:13+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#breadcrumb\"},\"inLanguage\":\"zh-TW\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.tm-robot.com\/zh-hant\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Docs\",\"item\":\"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Optimize Your Object Handling with Upward-Looking Camera\"}]}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Optimize Your Object Handling with Upward-Looking Camera | Techman Robot","robots":{"index":"noindex","follow":"follow"},"og_locale":"zh_TW","og_type":"article","og_title":"Optimize Your Object Handling with Upward-Looking Camera | Techman Robot","og_description":"Examples are valid for: TMflow Software version:\u00a0 All. 
[&hellip;]","og_url":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/","og_site_name":"Techman Robot","article_modified_time":"2024-10-07T10:35:13+00:00","og_image":[{"url":"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg"}],"twitter_card":"summary_large_image","schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Organization","@id":"https:\/\/www.tm-robot.com\/de\/#organization","name":"Techman Robot","url":"https:\/\/www.tm-robot.com\/de\/","sameAs":[],"logo":{"@type":"ImageObject","@id":"https:\/\/www.tm-robot.com\/de\/#logo","inLanguage":"zh-TW","url":"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2018\/09\/logo.png","contentUrl":"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2018\/09\/logo.png","width":221,"height":196,"caption":"Techman Robot"},"image":{"@id":"https:\/\/www.tm-robot.com\/de\/#logo"}},{"@type":"WebSite","@id":"https:\/\/www.tm-robot.com\/de\/#website","url":"https:\/\/www.tm-robot.com\/de\/","name":"Techman Robot","description":"Intelligent Cobots for a World of Applications","publisher":{"@id":"https:\/\/www.tm-robot.com\/de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.tm-robot.com\/de\/?s={search_term_string}"},"query-input":"required 
name=search_term_string"}],"inLanguage":"zh-TW"},{"@type":"ImageObject","@id":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#primaryimage","inLanguage":"zh-TW","url":"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg","contentUrl":"https:\/\/tm-robot.oss-cn-hongkong.aliyuncs.com\/wp-content\/uploads\/2024\/10\/messageImage_1728282011295.jpg"},{"@type":"WebPage","@id":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#webpage","url":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/","name":"Optimize Your Object Handling with Upward-Looking Camera | Techman Robot","isPartOf":{"@id":"https:\/\/www.tm-robot.com\/de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#primaryimage"},"datePublished":"2024-10-07T07:00:25+00:00","dateModified":"2024-10-07T10:35:13+00:00","breadcrumb":{"@id":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#breadcrumb"},"inLanguage":"zh-TW","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/optimize-your-object-handling-with-upward-looking-camera\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.tm-robot.com\/zh-hant\/"},{"@type":"ListItem","position":2,"name":"Docs","item":"https:\/\/www2.tm-robot.com\/zh-hant\/docs\/"},{"@type":"ListItem","position":3,"name":"Optimize Your Object Handling with Upward-Looking 
Camera"}]}]}},"_links":{"self":[{"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/docs\/117122"}],"collection":[{"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/docs"}],"about":[{"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/types\/docs"}],"author":[{"embeddable":true,"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/users\/8760"}],"replies":[{"embeddable":true,"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/comments?post=117122"}],"version-history":[{"count":2,"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/docs\/117122\/revisions"}],"predecessor-version":[{"id":133113,"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/docs\/117122\/revisions\/133113"}],"wp:attachment":[{"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/media?parent=117122"}],"wp:term":[{"taxonomy":"doc_category","embeddable":true,"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/doc_category?post=117122"},{"taxonomy":"doc_tag","embeddable":true,"href":"https:\/\/www2.tm-robot.com\/zh-hant\/wp-json\/wp\/v2\/doc_tag?post=117122"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}