On Wednesday, Google used its Google+ platform to introduce Project Glass, its take on the eyeglasses of the future. Using pieces of smart glass with a heads-up display (HUD), Google's wearable device mixes communication technology like social networking, calling and texting with real-world elements like people, places and things. Yet Google's high-tech glasses come with a fair share of flaws.

"We think technology should work for you, to be there when you need it and get out of your way when you don't," Google said in a Google+ post. "A group of us from Google[x] started Project Glass to build this kind of technology, one that helps you explore and share your world, putting you back in the moment. We're sharing this information now because we want to start a conversation and learn from your valuable input. So we took a few design photos to show what this technology could look like and created a video to demonstrate what it might enable you to do."

It's a fascinating piece of consumer tech when it all comes together, but there are a few troubling aspects of Project Glass that hold it back from being an insanely great device. Here are five of the biggest problems with Project Glass:

1. Filtering messages. What happens when you receive two texts and a phone call, and approach a location, all at the same time? How many icons will be in front of your eyes, and how can you control them so they don't distract you from what you're doing? There doesn't seem to be a way to limit which applications can notify you, but that will need to be an option if Google hopes to make its glasses successful.

In Google's introductory video, the glasses wearer accepts every notification he receives, whether from friends, loved ones or Google Maps. But how do you control these notifications, or turn them off? We don't always feel like talking on the phone whenever Mom calls. How do we avoid that awkward moment when that awkward guy from work sends a text about a party? If you accidentally read the text and blurt out "Oh s**t," will Project Glass dictate your reply and send it off? At least Siri asks if you're sure about what you're sending. Google needs to work out ways to filter the messages we want to respond to and, more importantly, the ones we don't (a rough sketch of what such filtering rules could look like follows this list).

2. Multi-tasking. Smartphones, tablets and computers are constantly attempting to take our attention away from the real world. Google's Project Glass may be the first piece of technology that actually wants to keep you in the moment, but will it?

Google's glasses introduce yet another screen on top of computers, smartphones, tablets and gaming devices. But this screen sits right in front of your eyes, so its messages will be very difficult to ignore. And since the glasses track your eye movements, they may accidentally perform an action when you happen to glance at something, perhaps in the sky, since the icons appear when you look up.

Owners of these glasses will be placing a lot of trust in Google, but the ultimate question is, can Google be trusted with your attention? If too many pop-ups or notifications occur, the attention required for these glasses could tear us apart, rather than bring us together.

3. Limited sharing networks. Can we only share with Google+? Really? Google desperately wants its network to succeed, but forcing users to exclusively rely on its fledgling social network could be disastrous. We don't necessarily need to post to Facebook every other minute, but adding Twitter would be a tremendous move. Now that Google has officially introduced Project Glass to the public, maybe it can invite other social networks to join the party inside your glasses.

4. Learning context. At any given moment you could be talking on the phone, cooking dinner and chatting with friends in the room, all at once. How will Google's glasses know which activity you want to focus on? If you're on the phone and want to shout to someone in the room, how do you mute the receiver on your glasses? Google's concept device borders on sensory overload, so it will need to pick up on contextual clues to decipher whom your message is intended for.

5. Internet reliance. Project Glass comes with plenty of incredible features, but how many of them require an adequate Internet connection?

In the video, icons are displayed in a virtual row, covering the date, time and temperature, along with options for music, texting, calling, Google+, taking pictures, reminders, searching (the Web, presumably), voice dictation and location-based services. Unfortunately, at least six of those major services will need an Internet connection to work. In the same way Siri depends on the Internet to perform, these glasses could be spectacular, so long as you never leave a well-connected zone.

Then imagine going to the airport. Will users have to pay by the minute or hour to use their Google glasses in the terminal? Let's hope not.
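Returning to the filtering problem raised in point 1, here is a minimal sketch in Python of the kind of per-sender and per-app rules the glasses would need. It is purely hypothetical: Google has published no API for Project Glass, and the app labels, sender names and quiet-hours values below are invented for illustration.

```python
from datetime import datetime, time

# Hypothetical notification rules -- not based on any published Project Glass API.
MUTED_SENDERS = {"awkward guy from work"}
ALLOWED_APPS = {"texts", "calls", "maps"}   # apps permitted to notify at all
QUIET_HOURS = (time(22, 0), time(7, 0))     # no notifications overnight


def should_display(app: str, sender: str, now: datetime) -> bool:
    """Decide whether a notification is shown on the heads-up display."""
    if app not in ALLOWED_APPS:
        return False
    if sender in MUTED_SENDERS:
        return False
    start, end = QUIET_HOURS
    in_quiet_hours = now.time() >= start or now.time() < end
    return not in_quiet_hours


if __name__ == "__main__":
    afternoon = datetime(2012, 4, 4, 14, 30)
    print(should_display("texts", "Mom", afternoon))                    # True
    print(should_display("texts", "awkward guy from work", afternoon))  # False
```

Even rules this simple would go a long way toward letting Mom's calls through while keeping the awkward coworker's party texts off your eyeballs.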

How Project Glass Works

When a user dons Google glasses, the HUD senses its owner and immediately activates, briefly showing the various icons and applications, much the way an old Macintosh would display the logo of each piece of software as it booted up. To activate an application, the owner simply looks up and finds the icon he wants along a virtual row, and Google's eye-tracking software detects which application he intends to open. In the video, the owner says "hm," which could potentially serve as a secondary vocal cue to confirm the selection.
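As a way of picturing that interaction, here is a speculative sketch in Python of the gaze-then-confirm flow the video appears to show; the class, the icon names and the "hm" cue are inferred from the demo, not from anything Google has published.

```python
# Speculative sketch: an icon is highlighted while the wearer's gaze rests on it,
# and a short vocal cue ("hm") confirms the selection.

ICON_ROW = ["music", "texts", "calls", "google+", "camera", "reminders", "search"]


class GlassHUD:
    def __init__(self):
        self.highlighted = None   # icon the wearer is currently looking at
        self.active = None        # icon confirmed by the vocal cue

    def on_gaze(self, icon: str) -> None:
        """Eye-tracking callback: highlight whatever icon the gaze rests on."""
        if icon in ICON_ROW:
            self.highlighted = icon

    def on_vocal_cue(self, cue: str) -> None:
        """The vocal cue confirms the highlighted icon, guarding against stray glances."""
        if cue == "hm" and self.highlighted is not None:
            self.active = self.highlighted


hud = GlassHUD()
hud.on_gaze("camera")    # wearer looks up at the camera icon
hud.on_vocal_cue("hm")   # vocal cue confirms the selection
print(hud.active)        # -> camera
```

Requiring the vocal cue on top of the gaze would also help with the stray-glance problem raised earlier under multi-tasking.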

When the owner receives a notification, such as a text from a friend, he looks at it (or utters another "hm") to open the message. To respond, he says "um" to start the dictation feature, then speaks his message as it is transcribed directly in front of his eyes.

Project Glass also takes advantage of location-based services to give the user information about his surroundings. In Google's video, the owner approaches a subway station, only to receive a notification on his glasses that the train's service has been suspended. Instead, he activates an alternative walking route (rather than a bus or transit option) with his eyes, and the glasses display arrows showing which way to go via GPS navigation. Since Google Maps also works indoors, the user can find sections within a store, like the fiction section of a bookstore.
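A rough, hypothetical sketch of that fallback logic might look like the following; the status feed and function names are made up, since Google has not described how Project Glass actually talks to Google Maps.

```python
# Invented status feed standing in for real transit data.
SERVICE_STATUS = {"subway": "suspended", "bus": "running"}


def pick_route(destination: str) -> str:
    """Prefer transit, but fall back to walking directions when service is down."""
    if SERVICE_STATUS.get("subway") == "running":
        return f"subway directions to {destination}"
    return f"walking directions to {destination}"  # shown as turn-by-turn arrows on the HUD


print(pick_route("the bookstore"))  # -> walking directions to the bookstore
```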

The smart eyeglasses can also pick up on location-based services that others use, such as Foursquare. If you want to know if your friend is in your area, you can ask your glasses, which give you an approximate distance (in feet or miles) to that person. And speaking of Foursquare, if you want to check in to a given location, you need only look up in or around that location, choose the Location application, and check in.
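The distance readout itself is straightforward geometry. Here is a minimal sketch, assuming the glasses have a latitude/longitude pair for both you and your friend (how they would obtain the friend's check-in location is not documented), using the standard haversine formula and switching between feet and miles for readability.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8


def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))


def friendly_distance(lat1, lon1, lat2, lon2):
    """Report short distances in feet, longer ones in miles."""
    miles = distance_miles(lat1, lon1, lat2, lon2)
    return f"{miles * 5280:.0f} feet" if miles < 0.25 else f"{miles:.1f} miles"


# Two points a few blocks apart in Manhattan (arbitrary example coordinates).
print(friendly_distance(40.7336, -73.9905, 40.7352, -73.9911))
```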

The user can also set reminders with visual cues, which will surely make Apple's Siri engineers irate. Just by looking at an object and saying, "Remind me to ...," a user can set a reminder for anything at any time.

One of the best features of Google's glasses is something sci-fi fans and writers could have only dreamed of: taking pictures with your eyes. If you happen upon a picturesque scene in your travels, you don't need to fumble for a camera. As long as you're wearing the glasses, you only need to say, "Take a picture of this," and your glasses become your camera lens. Just adjust, snap, and either share it to your Circles (there's Google+ again) or delete it. In the department of capturing moments instantaneously, Google's Project Glass even one-ups the Lytro camera.

Then, there's chatting. When a user receives an incoming chat notification, he can choose to either just talk (like a normal phone call) or go into a video chat that shows his friend whatever he is looking at through his Google glasses. This opens the door to tremendous possibilities for sharing what one does on a daily basis.

Google did not say when these glasses are coming, how far along development is, or how much these babies will cost, but it's safe to assume the finished product is at least a year or two away. Watch the video, and tell us your impressions in the comments section below.