Thursday, July 25, 2019

Introduction to Augmented Reality and ARCore (4) - Bringing ARCore to life

Links
  1. Introduction to augmented reality (AR)
  2. The basics of AR functionality
  3. Taking the next steps with ARCore

Bringing ARCore to life


A closer look at the mechanics of ARCore

  • Surface detection allows ARCore to place digital objects on various surface heights, to render different objects at different sizes and positions, and to create more realistic AR experiences in general.
  • Pose is the position and orientation of any object in relation to the world around it. Everything has its own unique pose: from your mobile device to the augmented 3D asset that you see on your display.
  • Hit-testing lets you establish a pose for virtual objects and is the next step in the ARCore user process after feature-tracking (finding stationary feature points that inform the device's environmental understanding) and plane-finding (the smartphone-specific process by which ARCore determines where horizontal surfaces are in your environment); a minimal code sketch of this flow follows this list.
  • Light estimation is a process that allows the phone to estimate the environment's current lighting conditions. ARCore is able to detect objects in suboptimal light and map a room successfully, but it’s important to note that there is a limit to how low the light can be for the experience to function.
  • Occlusion is when one 3D object blocks another 3D object. Currently this is only possible with digital objects, and AR objects cannot be occluded by a real-world object. For example, in an AR game the digital object would not be able to hide behind a real couch in the real world.
  • In multi-plane detection, assets are scaled appropriately in relation to the established planes, though they only need to be placed on those planes (via anchor points) when doing so makes them behave like their real-world counterparts.
  • Immersion can be broken when users try to interact with AR objects as if they were physically real. Framing can be used to combat these immersion-breaking interactions.
  • Spatial mapping is the ability to create a 3D map of the environment and helps establish where assets can be placed.
  • Feature points are stationary and are used to further environmental understanding and place planes in an experience. ARCore assumes planes are unmoving, so it is inadvisable to attempt to anchor a digital object to a real world object that is in motion. In general, it’s best not to place an object until the room has been sufficiently mapped and static surfaces have been recognized and designated as feature points.
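
To make the hit-testing and anchoring flow described above more concrete, here is a minimal sketch in Java against the ARCore SDK. It assumes you already have a running Session and a tap position in screen pixels; placeVirtualObject is a hypothetical hook for your own rendering code, and error handling is omitted.

import com.google.ar.core.Anchor;
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.core.Session;
import com.google.ar.core.Trackable;
import com.google.ar.core.exceptions.CameraNotAvailableException;

public class TapToPlace {

    // Minimal tap-to-place flow: update the frame, hit-test the tap, anchor on a plane.
    public void onTap(Session session, float tapX, float tapY)
            throws CameraNotAvailableException {
        Frame frame = session.update();                  // latest camera frame and device pose
        for (HitResult hit : frame.hitTest(tapX, tapY)) {
            Trackable trackable = hit.getTrackable();
            // Only accept hits that land inside a detected plane's polygon.
            if (trackable instanceof Plane
                    && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                Anchor anchor = hit.createAnchor();      // pin virtual content to this pose
                placeVirtualObject(anchor);              // hypothetical hook into your renderer
                break;
            }
        }
    }

    private void placeVirtualObject(Anchor anchor) {
        // Placeholder: attach your 3D model to the anchor here.
    }
}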

Poly, a library of 3D assets for your AR app


Google's Poly is a 3D asset library that lets you quickly find 3D objects and scenes for use in your apps, and it was built from the ground up with AR and VR development in mind.

You can use VR creation tools from Google like Tilt Brush and Blocks to build 3D assets and store them on Poly for use in AR apps.

Poly allows you to browse, search, view and download thousands of objects and scenes for your project.

Poly is available on desktop, mobile, and in VR.

Poly can also generate GIFs of creations and publish them to the web.

The Poly API provides read access to assets in the Poly library.

With the Poly API, users can access Google’s growing collection of Creative Commons 3D assets and interact directly with Poly to search, download, and import objects dynamically across desktop, mobile, virtual reality, and augmented reality.

Creators can find all types of assets for applications and easily search for remixable, free assets licensed under a Creative Commons license by keyword, category, format, popularity or date uploaded.

Creators can filter by model complexity, or create a personalized experience by letting users sign into the app with their Google account to access any assets they’ve uploaded or liked on Poly.
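
As a rough illustration of that read access, the sketch below queries the Poly REST API from Java for assets matching a keyword. The endpoint and the keywords/format/key query parameters follow the public Poly API documentation as best I recall, so treat the exact parameter names as assumptions; YOUR_API_KEY is a placeholder for a real API key.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class PolySearch {
    public static void main(String[] args) throws Exception {
        // Search the Poly library for OBJ-format assets matching a keyword.
        String url = "https://poly.googleapis.com/v1/assets"
                + "?keywords=" + URLEncoder.encode("piano", "UTF-8")
                + "&format=OBJ"
                + "&key=YOUR_API_KEY";   // placeholder: supply your own API key

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("GET");

        // The response is JSON: a list of assets with names, authors, and download URLs.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}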

Sceneform for easier AR content creation

Sceneform SDK is a high-level 3D framework that makes it easy for users to build AR apps in Java. It offers a new library for Android that enables the rapid creation and integration of AR experiences, and combines ARCore with a powerful physically-based 3D renderer. It includes a runtime API for working with graphics and rendering, and a plugin to help you import, preview, and tweak the look and feel of your assets directly in Android Studio.

Sceneform is highly optimized for mobile. Java developers can now build immersive, 3D apps without having to learn complicated APIs like OpenGL. They can use it to build AR apps from scratch as well as add AR features to existing ones.

What follows is a walkthrough of how to use Sceneform. It's more technically advanced than most of the other content in this course--it's very helpful to have a little background in Java to fully appreciate how you might use it yourself--but we've included it so that aspiring creators can start to learn how to use Sceneform to make their own AR content.
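
As a small preview of that walkthrough, the sketch below shows the typical Sceneform pattern in Java: load a renderable asynchronously, then attach it to an anchor when the user taps a detected plane. It assumes an ArFragment is already in your layout and that an asset named andy.sfb has been imported with the Sceneform plugin; both are placeholders for your own setup.

import android.net.Uri;
import com.google.ar.core.Anchor;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class SceneformTapToPlace {

    // Call once (e.g. from Activity#onCreate) after the ArFragment is available.
    public void setUpTapToPlace(ArFragment arFragment) {
        arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
            // Load the renderable asynchronously; a real app would cache it
            // rather than rebuilding it on every tap.
            ModelRenderable.builder()
                .setSource(arFragment.getContext(), Uri.parse("andy.sfb")) // placeholder asset
                .build()
                .thenAccept(renderable -> {
                    Anchor anchor = hitResult.createAnchor();   // pin to the tapped plane
                    AnchorNode anchorNode = new AnchorNode(anchor);
                    anchorNode.setParent(arFragment.getArSceneView().getScene());

                    // TransformableNode adds drag, pinch-to-scale, and twist gestures.
                    TransformableNode node =
                        new TransformableNode(arFragment.getTransformationSystem());
                    node.setParent(anchorNode);
                    node.setRenderable(renderable);
                    node.select();
                });
        });
    }
}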

Using Poly and Unity to create ARCore assets

Unity is a cross-platform game engine and development environment for both 3D and 2D interactive applications. It has a variety of tools, from the simple to the professionally complex, to allow for the streamlined creation of 3D objects and environments.

Poly Toolkit for Unity is a plugin that allows you to import assets from Poly into Unity at edit time and at runtime.

  • Edit-time means manually downloading assets from Poly and importing them into your app's project while you are creating your app or experience.
  • Runtime means downloading assets from Poly while your app is running. This allows your app to leverage Poly's ever-expanding library of assets.


Introduction to Augmented Reality and ARCore (3) - Taking the next steps with ARCore

Links
  1. Introduction to augmented reality (AR)
  2. The basics of AR functionality


Taking the next steps with ARCore


Cloud Anchors for shared AR


As we’ve covered, anchors are the mechanism by which you can attach virtual content to a trackable real-world point.

Building on this concept and supported by the cloud, Cloud Anchors are a cross-platform feature that allow both iOS and Android device users to share the same AR experience despite using different underlying AR technologies. Where traditionally anchors have been isolated to one device—or one augmented reality—these anchors can be shared by multiple different devices simultaneously.

This makes AR possible for groups of people rather than just individuals, allowing for shared, collaborative experiences like redecorating your home, playing games and making art in 3D space together.
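
For developers, the sharing flow looks roughly like the sketch below, using the ARCore Cloud Anchor calls in Java: one device hosts a local anchor to the cloud and receives an ID, that ID is shared through your own backend (sharing it is not part of ARCore), and other devices resolve the ID back into an anchor. This is a simplified outline; the asynchronous state handling is only described in comments.

import com.google.ar.core.Anchor;
import com.google.ar.core.Session;

public class CloudAnchorSharing {

    // Device A: upload a locally created anchor; the returned anchor gains a cloud ID
    // once hosting succeeds. (The session's Config must have cloud anchors enabled.)
    public Anchor host(Session session, Anchor localAnchor) {
        Anchor cloudAnchor = session.hostCloudAnchor(localAnchor);
        // Hosting is asynchronous: check cloudAnchor.getCloudAnchorState() each frame and
        // read cloudAnchor.getCloudAnchorId() once the state is SUCCESS, then send that ID
        // to other users through your own backend.
        return cloudAnchor;
    }

    // Device B: turn a shared cloud ID back into a local anchor in its own session.
    public Anchor resolve(Session session, String cloudAnchorId) {
        Anchor resolvedAnchor = session.resolveCloudAnchor(cloudAnchorId);
        // Also asynchronous: wait for the SUCCESS state before attaching virtual content.
        return resolvedAnchor;
    }
}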


Use cases and current powers/limitations of AR

  • ARCore can be used to create dynamic experiences for businesses, nonprofits, healthcare, schools, and more.
  • ARCore’s strengths are its phone-based spatial mapping capabilities and addressable user base. Approximately 85% of phones around the world run on the Android operating system.
  • As of the beginning of 2018, ARCore is already available on 100 million Android-powered smartphones, and that number continues to grow. ARCore requires a lot of processing power, so not all older Android models have the necessary specifications. ARCore is also available in China.
  • Limitations to consider with contemporary AR technology include: low-light environments, surfaces that lack distinct features, and the need for powerful mobile processors found only in newer phones.

User experience (UX) and user interface (UI) to consider

User interface (UI) and user experience (UX) are two complementary fields of product design. They have a lot in common, but generally speaking UX is the more technical of the two. UX is the process and underlying framework of enhancing user flow to create products with high usability and accessibility for end users while UI is more akin to graphic design and focuses on the front-end design elements of the interface.

At the most basic level, UI is the visuals of your app and everything that a user interacts with, and UX is the process that supports the user's journey and interactions.

Before creating an AR app, it is important to think carefully about the design so that you use the limited space of a phone display in the best possible way. By creating the most intuitive UX/UI for your app, you give it the best chance to succeed with users.

When using an AR app your smartphone’s screen will mostly be filled with input from the camera, showing the live feed of the real world. It’s important to not overly clutter the screen with buttons or other elements that are nonessential and might be confusing for users.

For example, the AR Stickers app displays placeable 3D objects at the top of the screen, so you can drag and drop a character into your scene. The app also allows you to hide the objects menu by tapping the “^” button. This lets the user concentrate on the character without the additional noise of the objects menu.

If your app demands a lot of features, make sure the icons are displayed at the corners of the screen and don't take up the crucial real estate in the middle of the screen, which is usually where most of the interaction with AR objects takes place. Giving users the option to hide the menu is also a great way to preserve immersion.

It is important not to crowd the screen with too much information at once. However, with AR you’re not restricted to just the screen when it comes to UI menus. One option that allows creators to incorporate a lot of information is to use the environment as a menu, letting users interact with items or display information tagged to specific features, such as buildings.

Basic AR interaction options

The following list offers some of the major examples of the types of interaction options AR app creators can give users. Treat this as a starting point rather than a complete list; mobile AR is young, and creators are discovering new interaction mechanics all the time.

  1. Drag and Drop: This feature lets users drag and drop objects from a menu of 3D digital assets “onto” the screen to place them on a real world surface (that has been spatially mapped by ARCore).
  2. Voice: Voice is quickly emerging as a powerful interaction tool that creators can build into AR apps. Building in pre-programmed voice commands allows users to execute specific actions within the AR app. This is often best achieved by embedding the Google Assistant SDK to add voice-enabled smart interaction.
  3. Tap: With the tap mechanic, users can place objects in the real world by tapping on the screen. ARCore uses raycasting (a ray-to-surface intersection test), projecting a ray to estimate where the AR object should be placed so that it appears on the real-world surface in a believable way. Another way to use tap is as a mechanic for interacting with a digital object that is already placed in the scene. For example, the app might animate a 3D object when users tap on it.
  4. Pinch and Zoom: Pinch and zoom lets users enlarge or scale down a 3D object—or use the interaction in creative ways to build a game or user experience. For example, this could be used to pull back the string of a bow in a bow-and-arrow target game.
  5. Slide: Users can interact with 3D objects through sliding, which translates (or moves) objects in-scene, or use it as a game mechanic. For example, say you are creating an AR paper toss game. You could enable a slide interaction to let users project or throw papers into a trash can.
  6. Tilt: Using the accelerometer and gyroscope, a tilt of the phone can also be used as an input for creative interactions. An easy example would be to make a “steering wheel” mechanic for a racing tabletop AR game (see the sketch after this list).
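
As an example of the tilt mechanic in item 6, here is a rough sketch in Java that uses Android's standard sensor APIs to turn phone rotation into a steering value. The steering field and the fixed update interval are simplifying assumptions; a real app would integrate over the actual sensor timestamps, smooth the input, and unregister the listener when done.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class TiltSteering implements SensorEventListener {
    private float steering = 0f;   // accumulated steering angle in radians (placeholder use)

    public void start(Context context) {
        SensorManager sensorManager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor gyroscope = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, gyroscope, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // values[2] is the angular speed around the phone's z-axis in rad/s;
        // integrate it over the update interval to get a steering angle.
        float angularSpeedZ = event.values[2];
        steering += angularSpeedZ * 0.02f;   // assumes roughly 50 Hz updates for simplicity
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}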

Think like a user

  • User flow is the journey of your app's users and how a person will engage, step by step, with your AR experience.
  • Planning your user flow needs to take into account the scene, the user interactions, any audio cues, and the final user actions.
  • A user flow can be created with simple sketches and panels all collected into one cohesive diagram.
  • UX and UI are complementary fields of product design, and generally speaking UX is the more technical of the two.
  • When considering UX/UI, one good rule of thumb to remember with AR is to avoid cluttering the screen with too many buttons or elements that might be confusing to users.
  • Choosing to use cartoonish designs or lighting can actually make the experience feel more realistic to the user, as opposed to photorealistic assets that fail to meet our expectations when they don't blend in perfectly with the real world.
  • Users might try to “break” your experience by deliberately disregarding your carefully planned user flow, but your resources are better spent on improving your app’s usability rather than trying to prevent bad actors.

Next steps on the AR journey

  • Advanced 3D design tools like Maya, ZBrush, Blender, and 3ds Max are powerful professional tools.
  • Google’s Poly can be a good starting resource for building your first ARCore experience.
  • Poly by Google is a repository of 3D assets that can be quickly downloaded and used in your ARCore experience.
  • The recommended guide for your AR experience is a design document that contains all of the 3D assets, sounds, and other design ideas for your team to implement.
  • You may need to hire advanced personnel to help you build your experience, such as: 3D artists, texture designers, level designers, sound designers, or other professionals.

Introduction to Augmented Reality and ARCore (2) - The basics of AR functionality

Links

1. Introduction to augmented reality (AR)


The basics of AR functionality


What makes AR feel real?


Enables: motion tracking for AR

Accelerometer: measures acceleration, which is the change in velocity divided by time. Simply put, it's the measure of how quickly velocity changes. Acceleration forces can be static/continuous—like gravity—or dynamic, such as movement or vibrations.

Gyroscope: measures and/or maintains orientation and angular velocity. When you change the rotation of your phone while using an AR experience, the gyroscope measures that rotation and ARCore ensures that the digital assets respond correctly.

Phone Camera: with mobile AR, your phone camera supplies a live feed of the surrounding real world upon which AR content is overlaid. In addition to the camera itself, ARCore-capable phones like the Google Pixel rely on complementary technologies like machine learning, complex image processing, and computer vision to produce high-quality images and spatial maps for mobile AR.

Enables: location-based AR

Magnetometer: gives smartphones a simple orientation related to the Earth's magnetic field. Because of the magnetometer, your phone always knows which direction is North, allowing it to auto-rotate digital maps depending on your physical orientation. This device is key to location-based AR apps.

GPS: a global navigation satellite system that provides geolocation and time information to a GPS receiver, like in your smartphone. For ARCore-capable smartphones, this device helps enable location-based AR apps.

Enables: view of real world with AR

Display: the display on your smartphone is important for crisp imagery and for showing 3D rendered assets. For instance, the Google Pixel XL has a 5.5" AMOLED QHD (2560 x 1440) display at 534 ppi, meaning the phone can show 534 pixels per inch, making for rich, vivid images.

In order to seem real, an AR object has to act like its equivalent in the real world.

  • Immersion is the sense that digital objects belong in the real world.
  • Breaking immersion means the sense of realism has been lost; in AR this usually happens when an object behaves in a way that does not match our expectations.
  • Placing is when a digital object is fixed, or anchored, to a certain point in the real world.
  • Scaling is when a placed AR object changes size and/or dimension relative to the AR device's position. For example, as a user moves toward or away from an AR object, the object appears larger or smaller depending on the phone's distance from it: objects farther from the phone look smaller and closer objects look larger, mimicking the depth perception of human eyes.
  • Occlusion occurs when one object blocks another object from view.
  • AR software and hardware need to maintain “context awareness” by tracking the physical objects in any given space and understanding their relationships to each other -- i.e. which ones are taller, shorter, further away, etc.

The magic of AR: detecting, sensing, and understanding

  • There are two basic ways to track the position and orientation of a device or user: outside-in tracking and inside-out tracking.
  • Outside-in tracking uses external cameras or sensors to detect motion and track positioning. This method offers more precise tracking, but a drawback is that the external sensors reduce portability.
  • Inside-out tracking uses cameras or sensors located within the device itself to track its position in the real world space. This method requires more hardware in the AR device, but offers more portability.
  • On the AR headset side, the Microsoft HoloLens is a device that uses inside-out tracking. On the VR headset side, the HTC Vive is a device that uses outside-in tracking.
  • On the AR mobile side, the Google Pixel is a smartphone that uses inside-out tracking for AR.

Inside ARCore: the building blocks of augmented reality

  • ARCore integrates virtual content with the real world as seen through your phone's camera and shown on your phone's display with technologies like motion tracking, environmental understanding, and light estimation.
  • Motion tracking uses your phone's camera, internal gyroscope, and accelerometer to estimate its pose in 3D space in real time.
  • Environmental understanding is the process by which ARCore “recognizes” objects in your environment and uses that information to properly place and orient digital objects. This allows the phone to detect the size and location of flat horizontal surfaces like the ground or a coffee table.
  • Light estimation in ARCore is a process that uses the phone's cameras to determine how to realistically match the lighting of digital objects to the real world's lighting, making them more believable within the augmented scene (see the sketch after this list).
  • Feature points are visually distinct features in your environment, like the edge of a chair, a light switch on a wall, the corner of a rug, or anything else that is likely to stay visible and consistently placed in your environment.
  • Concurrent odometry and mapping (COM) is the motion tracking process ARCore uses to track the smartphone's location in relation to the world around it.
  • Plane finding is the smartphone-specific process by which ARCore determines where surfaces are in your environment and uses those surfaces to place and orient digital objects. ARCore looks for clusters of feature points that appear to lie on common horizontal or vertical surfaces, like tables or walls, and makes these surfaces available to your app as planes. ARCore can also determine each plane's boundary and make that information available to your app. You can use this information to place virtual objects resting on flat surfaces.
  • Anchors “hold” the objects in their specified location after a user has placed them.
  • Motion tracking is not perfect. As you walk around, error, referred to as drift, may accumulate, and the device's pose may not reflect where you actually are. Anchors allow the underlying system to correct that error by indicating which points are important.
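
To illustrate the light estimation point above, here is a minimal sketch in Java that reads ARCore's per-frame light estimate; applyToMaterials is a placeholder for however your renderer consumes the values.

import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;

public class LightingHelper {

    // Call once per frame with the current ARCore frame.
    public void updateLighting(Frame frame) {
        LightEstimate estimate = frame.getLightEstimate();
        if (estimate.getState() != LightEstimate.State.VALID) {
            return; // not enough information this frame
        }

        // Overall brightness of the camera image, roughly 0.0 (dark) to 1.0 (bright).
        float pixelIntensity = estimate.getPixelIntensity();

        // RGB color correction plus an intensity component, for tinting virtual materials.
        float[] colorCorrection = new float[4];
        estimate.getColorCorrection(colorCorrection, 0);

        applyToMaterials(pixelIntensity, colorCorrection); // hypothetical renderer hook
    }

    private void applyToMaterials(float intensity, float[] colorCorrection) {
        // Placeholder: pass the values to your shader or rendering engine.
    }
}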

The current challenges facing AR today

  • Currently AR has a lack of user interface metaphors, meaning that a commonly understood method or language of human interaction has not been established.
  • The purpose of the interface metaphor is to give the user instantaneous knowledge about how to interact with the user interface. An example is a QWERTY keyboard or a computer mouse.
  • The details of what makes AR challenging from a technical standpoint are complex, but three influential factors are power, heat, and size.
  • AR requires high processing power, batteries generate heat, and a current challenge is fitting all the necessary components into a small enough form factor to wear on your face comfortably for extended periods of time.
  • Not everything in AR has to be 3D, but the vast majority of assets, applications, and experiences will require at least a little 3D design.
  • Currently, there is a limited base of people with 3D design and interaction skills, such as professional animators, graphic designers, mechanical engineers, or video game creators. For AR to grow, the adoption of 3D design theory, skills, and language needs to become much more widespread. Later on in this course, we’ll be discussing a few programs that are helping overcome this challenge, like Sceneform or Poly API.
  • Computer vision is a blend of artificial intelligence and computer science that aims to enable computers (like smartphones) to visually understand the surrounding world like human vision does. This technology needs to improve in terms of object detection and segmentation to make AR processes more effective.

Introduction to Augmented Reality and ARCore (1) - Introduction to augmented reality (AR)

I'm in an AR study group at work, and I wanted to dig a little deeper, so I went ahead and paid the $29 on Coursera to take this course. ARCore is really a secondary topic; think of it as a course for understanding what AR actually is and what knowledge you need in order to build it.

If you're a developer who knows nothing about AR, go to the Google AR & VR YouTube channel below and watch a few videos; you'll quickly get a feel for what it is.

https://www.youtube.com/channel/UCkUZagbGbewp3bZQLXGzkmA

As you study AR, it feels less like installing Unity and starting with C# code, and more like studying optics and video, in other words computer vision. It's the same way that, to learn ML, understanding math, data analysis, data modeling, and data visualization matters more than writing Python first.

Since the lecture videos can't be rewatched once you've completed them, I'm going to collect the summary text for each topic and organize it here.

Course link:
https://www.coursera.org/learn/ar

The lectures can be watched for free; you only pay if you need the certificate.

Overview

This class will teach you the fundamentals of augmented reality (AR), and how to build an AR experience using ARCore. Over the four-week course, you'll learn:

- How to identify different types of AR experiences
- Tools and platforms used in the AR landscape
- What makes AR feel "real"
- Popular use cases for AR
- How to create an AR user flow
- How AR experiences work
- Tools like Google Poly and Unity to build AR experiences
- Next steps to start building an AR experience using ARCore and other tools

This course will break down complex AR concepts to make them easy to understand, while also sharing expert tips and knowledge from Daydream's ARCore team. The course is great for beginners who are just getting started with AR or ARCore.

Introduction to augmented reality (AR)

What is AR?

Now that you’ve gained a basic understanding of what augmented reality is, it’s important to understand how it differs from VR.

The most obvious difference is in the hardware itself. A VR experience must be viewed in some kind of headset, whether it’s powered by a smartphone or connected to a high-end PC. VR headsets require powerful, low-latency displays capable of projecting complete digital worlds without dropping a frame. AR technology does not share this requirement. You can hold up your phone and have a headset-free AR experience any time.

Augmented reality is a direct or indirect live view of a physical, real-world environment whose elements are "augmented" by computer-generated perceptual information. Virtual reality is the use of computer technology to create a simulated environment, placing the user inside an experience.

Both technologies enable us to experience computing more like we experience the real world; they make computing work more like we do in regular life -- in a 3D space. In terms of how the two technologies are used, think of it like this: VR transports you to a new experience. You don't just get to see a place, you feel what it's like to be there. AR brings computing into your world, letting you interact with digital objects and information in your environment.

Generally speaking, this difference makes AR a better medium for day-to-day applications, because users don’t have to shut out the world to engage with them.

  • Humankind’s first foray into immersive reality through a head-mounted display was the “Sword of Damocles,” created by Ivan Sutherland in 1968.
  • HMD is the acronym for “head-mounted display.”
  • The term “Augmented Reality” was coined by two Boeing researchers in 1992.
  • A standalone headset is a VR or AR headset that does not require external processors, memory, or power.
  • Through the combination of their hardware and software, many smartphones can deliver AR experiences, though they are less immersive than those viewed on HMDs.
  • Many of the components in smartphones—gyroscopes, cameras, accelerometers, miniaturized high-resolution displays—are also necessary for AR and VR headsets.
  • The high demand for smartphones has driven the mass production of these components, resulting in greater hardware innovations and decreases in costs.
  • Project Tango was an early AR experiment from Google, utilizing a combination of custom software and hardware innovations that led to a phone with depth-sensing cameras and powerful processors to enable high-fidelity AR.
  • An evolution of Project Tango, ARCore is Google’s platform for building augmented reality experiences.

Types of AR experiences

  • Shopping and retail
  • Business
  • Social media
  • Gaming
  • Education
  • Healthcare
  • Nonprofits

Monday, July 22, 2019

"책 추천해 주세요"에 대답해 주고 싶은 말

This post comes from something that happened when I stopped by the Kyobo Book Centre in Gwanghwamun, and it's also a chance to write down my thoughts on one of the questions I get most often on community sites and in person: "please recommend a book."

In early July my wife said she was going to the Gwanghwamun Kyobo Book Centre to look up references for her thesis, so I tagged along to see what new computer books were out.

<광화문 교보문고 사진, 요즘 서점은 책만 잔뜩 있는 곳이 아니라 북까페로 인테리어가 되어 있다.
출처: 교보문고 사이트>

While I was scanning the Python and C++ shelves for books I hadn't seen before, two students who by all appearances were undergraduates were picking out books while debating whether Python or C was better. I wasn't really interested in which book they chose, but I assumed that if they were students they would settle on a language first and then decide which book on it was best; since they were weighing several different languages, I started to wonder whether they were actually computer science students.

So I worked up the courage to ask why they were looking at books for several different programming languages. It turned out they weren't computer science majors; they were in something like information and communications or information security, and they had an assignment that required learning to program, so they had come to look for a book. Maybe because I look rather old, the first thing they said when they saw me was, "Are you a professor?" All I had done was ask what books they were looking at, but apparently I looked like a professor.

When I said I was a developer and also mentor students, they asked me to recommend a good book on C.

Honestly, I think it wouldn't have mattered which book I recommended to those students.
What? Just any book? Why would you do something so half-hearted? Some people will think that. But for those students, the point isn't finding the best book; what matters is how much time they invest in studying whichever book they pick, whether that time is meaningful, and what they actually get out of the book.

Of course, I did point out a few books that sell well. They didn't take any of them, though, probably because they wanted to judge for themselves. They already knew it: a book someone else recommends may be good, but picking the book that looks right to you is usually the better call.

From here on, this is my own story.

Part of my job is giving people advice, and as a senior developer I'm in a position where I could name good books if asked.

But when someone asks me to recommend a book, I start by saying that I don't, and then I add the following caveats:

  • If you plan to read exactly one book and stop there, I can recommend one, but there are so many books on any given topic that a recommendation is hard to make.
  • A book that is good for me and a book that is good for you may be judged by different standards, so I don't recommend one hastily.
  • As for satisfaction, you would think the regret is the same whether you regret a book someone recommended or one you picked yourself, but regretting a recommended book is enough to push people past regret into unfounded anger at the person who recommended it, so I don't recommend.
Going further, there are people who ask for book recommendations out of the blue without a second thought, and after taking a look at what they are actually working on, my feeling is that I wish they would take a bit more interest and do some research of their own.

Here's why: they presumably want a book because they're interested in the field, want to study it, want to build up knowledge, prepare for a job, or get better at their work. Yet instead of looking a few things up and judging for themselves, they seem to want to lean on someone else's judgment and coast along.

I do understand the feeling of knowing nothing and wanting to be taught, but I'd like there to be some real "communication" with the person recommending, after the asker has done a little homework of their own. Instead it usually goes like this:
  • Please recommend a book on XXX
  • I recommend book YYY.
  • Thank you.
As a result, the person asking swallows the answer whole without any verification, and the person answering recommends from their own point of view without much reasoning, and in my opinion that doesn't look good either.

If, along with the request, you explain why you want a recommendation, what you're doing and in what field, and what you want to do next, that's a good start toward "communication." If spelling out your situation and your reasons for wanting a book feels tedious or difficult each time, preparing your blog or a relevant GitHub page in advance and linking to it would be more than enough.

Only when you ask this way can the person recommending understand your history and suggest a book accordingly, and by asking a few follow-up questions they can also tell you what you really need to study. In other words, even if you don't end up with a book recommendation, your odds of hearing something more valuable go up.

But if even that much preparation is too much trouble and you just want a book recommendation, then rather than asking people who may or may not know more than you and gratefully accepting whatever answer comes back, try becoming someone who chooses for themselves: use your own search skills, read the ratings and reviews online, and pick the book on your own.

--- Additional notes

In fact, since these requests are usually for books on a specific programming language such as Python, Android, or JavaScript, it goes beyond being difficult to recommend; mostly I just don't want to. There are dozens, if not hundreds, of such books, so my view is that if you find one that suits you and study it hard, it makes little difference whether or not you happened to pick "the best" one.

There is one possible exception: when you name a topic and there is literally only one book to read, so it isn't so much a recommendation as the only option. You might wonder whether such a book even exists. Does it? Surprisingly, it does.

자 만약 "프로그래밍이라는 행위에 대한 고찰과 그 프로그래머의 심리에 대해 나와 있는 책을 추천해 주세요" 라고 한다면 아마 아래 소개한 책이 유일할 것이다.

The Psychology of Computer Programming - 10/10
Written by Gerald M. Weinberg, translated by Cho Sang-min / Insight

But no one has ever asked me to recommend a book on this topic, and in any case they would have searched before asking; if they already knew this was the only book, the question itself would be meaningless.