Android Jelly Bean: The Best Tips and Tricks for Jelly Bean
Android "Jelly Bean" ist der Codename für die zehnte Version des von Google entwickelten mobilen Android-Betriebssystems, die drei Hauptversionen umfasst. Zu den Geräten, auf denen Android ausgeführt wird, gehört das Asus Nexus 7. ↑ Android , Jelly Bean. Archiviert vom Original am Oktober ; abgerufen am September Jelly Bean war der Name für die Versionen , und des Google-Betriebssystems Android. Eingeschränkte Profile für Tablets. Jetzt kannst du zu Hause und an deinem Arbeitsplatz den Zugriff auf Apps und Inhalte einschränken. Eltern können so. Künftig benötigt der Google-Browser mindestens Android KitKat. Android bis Jelly Bean läuft derzeit noch auf 3,2 Prozent aller.
Wer eine ältere Android-Version als nutzt, muss mit Sicherheitslücken im Browser leben, die Google nicht schließen will – und selbst die. Viele Nutzer warten noch auf ICS. Andere dürfen sich schon auf das Jelly-Bean-Update freuen. Ob Ihr Gerät dabei ist, sehen Sie in dieser. Künftig benötigt der Google-Browser mindestens Android KitKat. Android bis Jelly Bean läuft derzeit noch auf 3,2 Prozent aller.
On October 29, 2012, Google unveiled Android 4.2. Jelly Bean versions are no longer supported by Google. Android 4.2 added a Presentation class for rendering content on external displays: you can use all of the normal tools to create a UI and render content in the Presentation, from building an arbitrary view hierarchy to using SurfaceView or SurfaceTexture to draw directly into the window for streamed content or camera previews.
Jelly Bean 4.3 adds several features for users:

Restricted profiles for tablets: put your tablet into a mode with limited access to apps and content.

Bluetooth Smart support: Bluetooth Smart minimizes power use while measuring and transmitting data for fitness sensors like Fitbit, Runtastic and other devices, making your phone or tablet more power efficient.

Dial pad autocomplete: just start touching numbers or letters and the dial pad will automatically suggest numbers or names.

OpenGL ES 3.0 support.

Virtual surround sound: enjoy movies from Google Play with surround sound on the Nexus 7 (2013 edition) and other Nexus devices.

Easier text input: an improved algorithm for tap-typing recognition makes text input easier.
On supported devices, apps can use a new HDR camera scene mode to capture an image using high dynamic range imaging techniques.
Additionally, the framework now provides an API to let apps check whether the camera shutter sound can be disabled.
Apps can then let the user disable the sound or choose an alternative sound in place of the standard shutter sound, which is recommended.
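As a sketch of that check (using the legacy android.hardware.Camera API from this era; cameraId and camera are placeholders, and this only runs on a real device):

```java
// Query whether this camera's shutter sound may be disabled (API 17+),
// then mute it if the device allows.
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info);
if (info.canDisableShutterSound) {
    camera.enableShutterSound(false); // silent capture is permitted here
} else {
    // The standard shutter sound must remain enabled on this device.
}
```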
Filterscript is a subset of Renderscript that is focused on optimized image processing across a broad range of device chipsets.
Filterscript is ideal for hardware-accelerating simple image-processing and computation operations such as those that might be written for OpenGL ES fragment shaders.
Because it places a relaxed set of constraints on hardware, your operations are optimized and accelerated on more types of device chipsets.
Any app targeting API level 17 or higher can make use of Filterscript. Intrinsics are available for blends, blur, color matrix, 3x3 and 5x5 convolve, per-channel lookup table, and converting an Android YUV buffer to RGB.
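The blur intrinsic, for example, can be applied to a bitmap in a few lines. A minimal sketch, assuming the android.renderscript classes introduced at API level 17; inBitmap and outBitmap are placeholder Bitmaps of the same size:

```java
// Hardware-accelerated Gaussian blur via the built-in Renderscript intrinsic.
RenderScript rs = RenderScript.create(context);
Allocation in = Allocation.createFromBitmap(rs, inBitmap);
Allocation out = Allocation.createTyped(rs, in.getType());

ScriptIntrinsicBlur blur = ScriptIntrinsicBlur.create(rs, Element.U8_4(rs));
blur.setRadius(8.0f);    // blur radius in pixels (valid range is (0, 25])
blur.setInput(in);
blur.forEach(out);       // executed on GPU or CPU as the platform decides

out.copyTo(outBitmap);   // write the result back into a Bitmap
rs.destroy();
```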
You can now create groups of Renderscript scripts and execute them all with a single call as though they were part of a single script.
This allows Renderscript to optimize execution of the scripts in ways that it could not do if the scripts were executed individually.
[Chart: Renderscript image-processing benchmarks run on different Android platform versions.] If you have a directed acyclic graph of Renderscript operations to run, you can use a builder class to create a script group defining the operations.
At execution time, Renderscript optimizes the run order and the connections between these operations for best performance.
When you use Renderscript for computation operations, your apps benefit from ongoing performance and optimization improvements in the Renderscript engine itself, without any impact on your app code or any need for recompilation.
As optimization improves, your operations execute faster and on more chipsets, without any work on your part.
The chart at right highlights the performance gain delivered by ongoing Renderscript optimization improvements across successive versions of the Android platform.
Renderscript Compute is the first computation platform ported to run directly on a mobile device GPU. It now automatically takes advantage of GPU computation resources whenever possible to improve performance.
With GPU integration, even the most complex computations for graphics or image processing can execute with dramatically improved performance.
Any app using Renderscript on a supported device can benefit immediately from this GPU integration, without recompiling. The Nexus 10 tablet is the first device to support this integration.
Android 4.2 includes new developer options that expose features for debugging and profiling your app from any device or emulator.
New developer options give you more ways to profile and debug on a device. In most cases, the new platform technologies and enhancements do not directly affect your apps, so you can benefit from them without any modification.
Every Android release includes dozens of security enhancements to protect users. Here are some of the enhancements in Android 4.
These improvements depend on hardware support — devices that offer these low-latency audio features can advertise their support to apps through a hardware feature constant.
New AudioManager APIs are provided to query the native audio sample rate and buffer size, for use on devices which claim this feature.
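As a sketch, the query looks like this (AudioManager property constants added at API level 17; the values shown in the comment are typical, not guaranteed):

```java
// Query the device's native output sample rate and buffer size so audio
// output can be configured to hit the fast, low-latency mixer path.
AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
// Both values are returned as strings, e.g. "44100" and "256"; either may
// be null on devices that do not advertise the low-latency audio feature.
```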
The Dalvik runtime includes enhancements for performance and security across a wider range of architectures.
To ensure a consistent framerate, Android 4.1 extends vsync timing across all drawing and animation done by the Android framework. This results in a more reactive and uniform touch response. Tooling can help you get the absolute best performance out of your apps.
The data is represented as a group of vertically stacked time series graphs, to help isolate rendering interruptions and other issues.
New APIs for accessibility services let you handle gestures and manage accessibility focus as the user moves through the on-screen elements and navigation buttons using accessibility gestures, accessories, and other input.
The Talkback system and explore-by-touch are redesigned to use accessibility focus for easier use and offer a complete set of APIs for developers.
Accessibility services can link their own tutorials into the Accessibility settings, to help users configure and use their services.
Apps that use standard View components inherit support for the new accessibility features automatically, without any changes in their code.
Apps that use custom Views can use new accessibility node APIs to indicate the parts of the View that are of interest to accessibility services.
Apps can display text or handle text editing in left-to-right or right-to-left scripts. Apps can make use of new Arabic and Hebrew locales and associated fonts.
The platform now supports user-installable keyboard maps, such as for additional international keyboards and special layout types.
By default, Android includes several built-in keymaps. When users connect a keyboard, they can go to the Settings app and select one or more keymaps that they want to use for that keyboard.
When typing, users can switch between keymaps using a shortcut (Ctrl-Space). You can create an app to publish additional keymaps to the system.
The APK would include the keyboard layout resources, based on the standard Android keymap format. Developers can also create custom notification styles to display rich content and actions.
Notifications have long been a unique and popular feature on Android. Apps can now display larger, richer notifications to users that can be expanded and collapsed with a pinch or swipe.
Notifications support new types of content, including photos, have configurable priority, and can even include multiple actions.
Through an improved notification builder, apps can create notifications that use a larger display area. Three templated notification styles are available: BigTextStyle, BigPictureStyle, and InboxStyle.
In addition to the templated styles, you can create your own notification styles using any remote View. Apps can add up to three actions to a notification, which are displayed below the notification content.
The actions let the users respond directly to the information in the notification in alternative ways. With expandable notifications, apps can give more information to the user, effortlessly and on demand.
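A minimal sketch of such an expandable notification with one action, using the builder and style classes from the Android 4.1 SDK; the resource ids and the PendingIntent pi are placeholders:

```java
// A big-text notification with an action button. The extra text becomes
// visible when the user expands the notification with a swipe or pinch.
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_message)          // placeholder resource
        .setContentTitle("New message")
        .setContentText("Tap to read")
        .setStyle(new Notification.BigTextStyle()
                .bigText("The full message text, shown only in the "
                        + "expanded form of the notification."))
        .setPriority(Notification.PRIORITY_HIGH)      // configurable priority
        .addAction(R.drawable.ic_reply, "Reply", pi)  // up to three actions
        .build();

NotificationManager nm = (NotificationManager)
        context.getSystemService(Context.NOTIFICATION_SERVICE);
nm.notify(1, notification);
```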
Users remain in control and can long-press any notification to get information about the sender and optionally disable further notifications from the app.
App Widgets can resize automatically to fit the home screen and load different content as their sizes change. New App Widget APIs let you take advantage of this to optimize your app widget content as the size of widgets changes.
For example, a widget could display larger, richer graphics or additional functionality or options.
Developers can still maintain control over maximum and minimum sizes and can update other widget options whenever needed. You can also supply separate landscape and portrait layouts for your widgets, which the system inflates as appropriate when the screen orientation changes.
App widgets can now be displayed in third-party launchers and other host apps through a new bind Intent (AppWidgetManager.ACTION_APPWIDGET_BIND).
At run time, as Activities are launched, the system extracts the Up navigation tree from the manifest file and automatically creates the Up affordance in the action bar.
Developers who declare Up navigation in the manifest no longer need to manage navigation by callback at run time, although they can also do so if needed.
Also available is a new TaskStackBuilder class that lets you quickly put together a synthetic task stack to start immediately or to use when an Activity is launched from a PendingIntent.
Creating a synthetic task stack is especially useful when users launch Activities from remote views, such as from Home screen widgets and notifications, because it lets the developer provide a managed, consistent experience on Back navigation.
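A sketch of that pattern with TaskStackBuilder (API 16+); DetailActivity and itemId are placeholders, and the parent chain is the one declared with android:parentActivityName in the manifest:

```java
// Build a synthetic back stack so that pressing Back from DetailActivity
// (launched from a notification or widget) lands on its parent, not Home.
Intent detail = new Intent(context, DetailActivity.class)
        .putExtra("item_id", itemId);

PendingIntent pi = TaskStackBuilder.create(context)
        .addParentStack(DetailActivity.class) // parents from the manifest
        .addNextIntent(detail)                // top of the synthetic stack
        .getPendingIntent(0, PendingIntent.FLAG_UPDATE_CURRENT);
// Attach 'pi' to a notification or remote view as usual.
```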
You can use a new helper class, ActivityOptions, to create and control the animation displayed when you launch your Activities.
Through the helper class, you can specify custom animation resources to be used when the activity is launched, or request new zoom animations that start from any rectangle you specify on screen and that optionally include a thumbnail bitmap.
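For example, a zoom animation that expands from the tapped view's on-screen rectangle might be requested like this (a sketch using the ActivityOptions factory methods added at API level 16; clickedView and intent are placeholders):

```java
// Launch an Activity with a scale-up animation starting from the view
// the user tapped. A thumbnail-based variant also exists.
ActivityOptions opts = ActivityOptions.makeScaleUpAnimation(
        clickedView,            // animation starts from this view
        0, 0,                   // start offset within the view
        clickedView.getWidth(),
        clickedView.getHeight());
context.startActivity(intent, opts.toBundle());
```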
New system UI flags in View let you cleanly transition from a normal application UI (with action bar, navigation bar, and system bar visible) to "lights out mode" (with status bar and action bar hidden and navigation bar dimmed) or "full screen mode" (with status bar, action bar, and navigation bar all hidden).
GridLayout lets you structure the content of your remote views and manage child views alignments with a shallower UI hierarchy. ViewStub is an invisible, zero-sized View that can be used to lazily inflate layout resources at runtime.
A new intent lets apps launch a preview of their Live Wallpaper; from the preview, users can directly load the Live Wallpaper. With Android 4.1, contact photos are stored at a larger maximum size. Apps can store and retrieve contact photos at that size or use any other size needed.
The maximum photo size supported on specific devices may vary, so apps should query the built-in contacts provider at run time to obtain the max size for the current device.
Apps can register to be notified when any new input devices are attached, by USB, Bluetooth, or any other connection type.
They can use this information to change state or capabilities as needed. For example, a game could receive notification that a new keyboard or joystick is attached, indicating the presence of a new player.
Apps can query the device manager to enumerate all of the input devices currently attached and learn about the capabilities of each.
Among other capabilities, apps can now make use of any vibrator service associated with an attached input device, such as for Rumble Pak controllers.
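A sketch of enumerating input devices and pulsing any attached vibrator, using the InputManager and InputDevice APIs from API level 16:

```java
// Enumerate attached input devices and rumble any that have a vibrator,
// e.g. a rumble-capable gamepad.
InputManager im = (InputManager) context.getSystemService(Context.INPUT_SERVICE);
for (int id : im.getInputDeviceIds()) {
    InputDevice device = im.getInputDevice(id);
    if (device != null && device.getVibrator().hasVibrator()) {
        device.getVibrator().vibrate(250); // short rumble pulse, in ms
    }
}
```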
Extending vsync across the Android framework leads to a more consistent framerate and a smooth, steady UI.
So that apps also benefit, Android 4.1 extends vsync timing to all drawing and animations initiated by apps. This lets them optimize operations on the UI thread and provides a stable timebase for synchronization.
The animation framework now uses vsync timing to automatically handle synchronization across animators.
For specialized uses, apps can access vsync timing through APIs exposed by a new Choreographer class.
Apps can request invalidation on the next vsync frame — a good way to schedule animation when the app is not using the animation framework.
For more advanced uses, apps can post a callback that the Choreographer class will run on the next frame.

The animation framework now lets you define start and end actions to take when running ViewPropertyAnimator animations, to help synchronize them with other animations or actions in the application.
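As a sketch, posting a vsync-aligned frame callback with Choreographer (API 16+) might look like this; updateAnimationState is a placeholder for app logic:

```java
// A frame callback that Choreographer runs on the next vsync. It re-posts
// itself to drive a manual, vsync-aligned animation loop.
Choreographer.FrameCallback callback = new Choreographer.FrameCallback() {
    @Override
    public void doFrame(long frameTimeNanos) {
        // frameTimeNanos is the stable vsync timebase for this frame.
        updateAnimationState(frameTimeNanos);
        Choreographer.getInstance().postFrameCallback(this);
    }
};
Choreographer.getInstance().postFrameCallback(callback);
```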
The action can run any runnable object. For example, the runnable might specify another animation to start when the previous one finishes. You can also now specify that a ViewPropertyAnimator use a layer during the course of its animation.
Previously, it was a best practice to animate complicated views by setting up a layer prior to starting an animation and then handling an onAnimationEnd event to remove the layer when the animation finishes.
Now, the withLayer method on ViewPropertyAnimator simplifies this process with a single method call. A new transition type in LayoutTransition enables you to automate animations in response to all layout changes in a ViewGroup.
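The layer and end-action pattern described above can be sketched as follows (view is a placeholder; withLayer and withEndAction were added at API level 16):

```java
// Fade a complex view out on a hardware layer. The layer is attached for
// the duration of the animation and removed automatically afterwards,
// replacing the old manual onAnimationEnd bookkeeping.
view.animate()
    .alpha(0f)
    .setDuration(300)
    .withLayer()
    .withEndAction(new Runnable() {
        @Override
        public void run() {
            view.setVisibility(View.GONE); // chain a follow-up step
        }
    });
```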
When the user triggers a transfer, Android Beam hands over from NFC to Bluetooth, making it really easy to manage the transfer of a file from one device to another.
Developers can take advantage of Wi-Fi network service discovery to build cross-platform or multiplayer games and application experiences.
Using the service discovery API, apps can create and register any kind of service, for any other NSD-enabled device to discover.
The service is advertised by multicast across the network using a human-readable string identifier, which lets users more easily identify the type of service.
Consumer devices can use the API to scan and discover services available from devices connected to the local Wi-Fi network.
After discovery, apps can use the API to resolve the service to an IP address and port through which it can establish a socket connection.
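A sketch of the resolve step with NsdManager (API 16+); serviceInfo is the NsdServiceInfo a DiscoveryListener received in its onServiceFound callback:

```java
// Resolve a discovered NSD service to a host address and port, then
// open a socket connection to it.
NsdManager nsd = (NsdManager) context.getSystemService(Context.NSD_SERVICE);
nsd.resolveService(serviceInfo, new NsdManager.ResolveListener() {
    @Override
    public void onServiceResolved(NsdServiceInfo resolved) {
        InetAddress host = resolved.getHost();
        int port = resolved.getPort();
        // Connect a socket to (host, port) to talk to the service.
    }

    @Override
    public void onResolveFailed(NsdServiceInfo info, int errorCode) {
        // e.g. retry the resolve or report the failure.
    }
});
```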
You can take advantage of this API to build new features into your apps. For example, you could let users connect to a webcam, a printer, or an app on another mobile device that supports Wi-Fi peer-to-peer connections.
Wi-Fi P2P is an ideal way to share media, photos, files and other types of data and sessions, even where there is no cell network or Wi-Fi available.
Pre-associated service discovery lets your apps get more useful information from nearby devices about the services they support, before they attempt to connect.
Apps can initiate discovery for a specific service and filter the list of discovered devices to those that actually support the target service or application.
On the other hand, your app can advertise the service it provides to other devices, which can discover it and then negotiate a connection.
This greatly simplifies discovery and pairing for users and lets apps take advantage of Wi-Fi P2P more effectively.
With Wi-Fi P2P service discovery, you can create apps and multiplayer games that can share photos, videos, gameplay, scores, or almost anything else — all without requiring any Internet or mobile network.
Your users can connect using only a direct p2p connection, which avoids using mobile bandwidth. Apps can query whether the current network is metered before beginning a large download that might otherwise be relatively expensive to the user.
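A sketch of that metered-network check, using the ConnectivityManager method added at API level 16; the two scheduling helpers are placeholders:

```java
// Check whether the active network is metered before starting a large
// download, and defer the transfer to save the user's mobile data.
ConnectivityManager cm = (ConnectivityManager)
        context.getSystemService(Context.CONNECTIVITY_SERVICE);
if (cm.isActiveNetworkMetered()) {
    scheduleDownloadForLater();   // placeholder: wait for Wi-Fi
} else {
    startLargeDownload();         // placeholder: fetch immediately
}
```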
An even sweeter Jelly Bean: Android 4.3

Jelly Bean 4.3 also brings lower-latency input for gamepad buttons and joysticks. Support for Bluetooth Smart Ready is already available on Nexus 7 and Nexus 4 devices and will be supported in a growing number of Android-compatible devices in the months ahead.
In addition to exposing playback controls on the remote devices connected over Bluetooth, apps can now transmit metadata such as track name, composer, and other types of media metadata.
A tablet owner can set up one or more restricted profiles in Settings and manage them independently.
Your app can offer restrictions to let owners manage your app content when it's running in a profile. With restricted profiles, tablet owners can quickly set up separate environments for each user, with the ability to manage finer-grained restrictions in the apps that are available in those environments.
Restricted profiles are ideal for friends and family, guest users, kiosks, point-of-sale devices, and more. Each restricted profile offers an isolated and secure space with its own local storage, home screens, widgets, and settings.
For developers, restricted profiles offer a new way to deliver more value and control to your users.
You can implement app restrictions — content or capabilities controls that are supported by your app — and advertise them to tablet owners in the profile configuration settings.
You can add app restrictions directly to the profile configuration settings using predefined boolean, select, and multi-select types.
If you want more flexibility, you can even launch your own UI from profile configuration settings to offer any type of restriction you want.
When your app runs in a profile, it can check for any restrictions configured by the owner and enforce them appropriately. For example, a media app might offer a restriction to let the owner set a maturity level for the profile.
At run time, the app could check for the maturity setting and then manage content according to the preferred maturity level. If your app is not designed for use in restricted profiles, you can opt out altogether, so that your app can't be enabled in any restricted profile.
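That maturity-level check could be sketched like this (UserManager.getApplicationRestrictions was added at API level 18; the "maturity_level" key is a hypothetical restriction this app would have advertised through its restriction entries):

```java
// Read the restrictions the tablet owner configured for this app in the
// current restricted profile, then enforce them.
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getApplicationRestrictions(context.getPackageName());

// Hypothetical key; returns the default (0) when the owner set nothing.
int maturityLevel = restrictions.getInt("maturity_level", 0);
filterContentByMaturity(maturityLevel);   // placeholder app logic
```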
Google Play services offers advanced location APIs that you can use in your apps. Hardware geofencing optimizes for power efficiency by performing location computation in the device hardware, rather than in software.
On devices that support hardware geofencing, Google Play services geofence APIs will be able to take advantage of this optimization to save battery while the device is moving.
Wi-Fi scan-only mode is a new platform optimization that lets users keep Wi-Fi scan on without connecting to a Wi-Fi network, to improve location accuracy while conserving battery.
Apps that depend on Wi-Fi for location services can now ask users to enable scan-only mode from Wi-Fi advanced settings. Wi-Fi scan-only mode is not dependent on device hardware and is available as part of the Android 4.3 platform.
New sensor types allow apps to better manage sensor readings. Uncalibrated gyroscope and uncalibrated magnetometer sensors report raw measurements as well as estimated biases to apps.
The new hardware capabilities are already available on Nexus 7 and Nexus 4 devices, and any device manufacturer or chipset vendor can build them into their devices.
To meet the needs of the next generation of media services, Android 4.3 introduces a modular media DRM framework. Through a combination of new APIs and enhancements to existing APIs, the media DRM framework provides an integrated set of services for managing licensing and provisioning, accessing low-level codecs, and decoding encrypted media data.
Apps using the media DRM framework manage the network communication with a license server and handle the streaming of encrypted data from a content library.
VP8 encoding support includes settings for target bitrate, rate control, frame rate, token partitioning, error resilience, reconstruction and loop filters.
The platform API introduces VP8 encoder support in a range of formats, so you can take advantage of the best format for your content.
VP8 encoding is available in software on all compatible devices running Android 4.3. For the highest performance, the platform also supports hardware-accelerated VP8 encoding on capable devices.
Starting in Android 4.3, apps can use a Surface as the input to a media encoder. For example, you can now direct a stream from an OpenGL ES surface to the encoder, rather than having to copy between buffers.
Apps can use new media muxer APIs to combine elementary audio and video streams into a single output file.
Notifications have long been a popular Android feature because they let users see information and updates from across the system, all in one place.
Now in Android 4.3, you can access notifications through new APIs that let you register a notification listener service and, with the permission of the user, receive notifications as they are displayed in the status bar.
Notifications are delivered to you in full, with all details on the originating app, the post time, the content view and style, and priority.
You can evaluate fields of interest in the notifications, process or add context from your app, and route them for display in any way you choose.
The new API gives you callbacks when a notification is added, updated, and removed either because the user dismissed it or the originating app withdrew it.
You'll be able to launch any intents attached to the notification or its actions, as well as dismiss it from the system, allowing your app to provide a complete user interface to notifications.
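A minimal listener might look like this (a sketch using the NotificationListenerService class from API level 18; the service must also be declared in the manifest with the BIND_NOTIFICATION_LISTENER_SERVICE permission and enabled by the user in Settings before it receives callbacks):

```java
// A minimal notification listener that logs posted and removed
// notifications from across the system.
public class MyListener extends NotificationListenerService {
    @Override
    public void onNotificationPosted(StatusBarNotification sbn) {
        Log.d("MyListener", "posted by " + sbn.getPackageName()
                + " at " + sbn.getPostTime());
    }

    @Override
    public void onNotificationRemoved(StatusBarNotification sbn) {
        Log.d("MyListener", "removed: " + sbn.getPackageName());
    }
}
```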
Users remain in control of which apps can receive notifications. At any time, they can look in Settings to see which apps have notification access and enable or disable access as needed.
Notification access is disabled by default — apps can use a new Intent to take the user directly to the Settings to enable the listener service after installation.
You can now create transparent overlays on top of Views and ViewGroups to render a temporary View hierarchy or transient animation effects without disturbing the underlying layout hierarchy.
Overlays are particularly useful when you want to create animations such as sliding a view outside of its container or dragging items on the screen without affecting the view hierarchy.
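A sketch of the overlay API (View.getOverlay was added at API level 18; the drawable resource is a placeholder):

```java
// Draw a transient drawable on top of a view without touching the layout
// hierarchy. The overlay is purely visual and takes no input events.
Drawable badge = getResources().getDrawable(R.drawable.badge); // placeholder
badge.setBounds(0, 0, 48, 48);
view.getOverlay().add(badge);    // appears above the view's own content
// ...later, when the effect is no longer needed:
view.getOverlay().remove(badge);
```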
A new layout mode lets you manage the positioning of Views inside ViewGroups according to their optical bounds, rather than their clip bounds.
You can use the optical bounds layout mode to properly align widgets that use outer visual effects such as shadows and glows. Apps can now define the exit and entry animation types used on a window when the device is rotated.
You can set window properties to enable jump-cut, cross-fade, or standard window rotation. The system uses the custom animation types when the window is fullscreen and is not covered by other windows.
Apps can set new orientation modes for Activities to ensure that they are displayed in the proper orientation when the device is flipped.
Additionally, apps can use a new mode to lock the screen to its current orientation. This is useful for apps using the camera that want to disable rotation while shooting video.
Your app can listen for the intent and send the message to the caller over your messaging system. The intent includes the recipient (the caller) as well as the message itself.
More debugging information is visible through the uiautomatorviewer tool, and pseudo-locales make it easier to test your app's localization.
To assist you with managing date formatting across locales, Android 4.3 adds a getBestDateTimePattern() method. To help you test your app more easily in other locales, Android 4.3 also introduces pseudo-locales.
Pseudo-locales simulate the language, script, and display characteristics associated with a locale or language group. Currently, you can test with a pseudo-locale for Accented English , which lets you see how your UI works with script accents and characters used in a variety of European languages.
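For the date-formatting case, a sketch using android.text.format.DateFormat.getBestDateTimePattern (added at API level 18); the patterns in the comment are illustrative:

```java
// Ask the platform for the best locale-specific pattern containing a
// month and a day, instead of hard-coding something like "MM/dd".
String skeleton = "MMMMd";  // desired fields, order-independent
String pattern = DateFormat.getBestDateTimePattern(
        Locale.getDefault(), skeleton);
// e.g. roughly "MMMM d" for en_US ("July 24") vs "d. MMMM" for de_DE
SimpleDateFormat fmt = new SimpleDateFormat(pattern, Locale.getDefault());
String text = fmt.format(new Date());
```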
Accessibility services can now request to filter key events; the service receives the events and can process them as needed before they are passed to the system or other installed apps. Accessibility services can also declare new capability attributes to describe what their services can do and what platform features they use.
For example, they can declare the capability to filter key events, retrieve window content, enable explore-by-touch, or enable web accessibility features.
In some cases, services must declare a capability attribute before they can access related platform features.
Building on the accessibility framework, Android 4.3 adds a UI automation framework. Through the UI automation framework you can perform basic operations, set rotation of the screen, generate input events, take screenshots, and much more.
Apps can now configure the Wi-Fi credentials they need for connections to WPA2 enterprise access points. Apps with permission to access and change Wi-Fi can configure authentication credentials for a variety of EAP and Phase 2 authentication methods.
This protects the operating system against potential security vulnerabilities. The KeyChain API now provides a method that allows applications to confirm that system-wide keys are bound to a hardware root of trust for the device.
This provides a place to create or store private keys that cannot be exported off the device, even in the event of a root or kernel compromise.
Using the APIs, apps can create or store private keys that cannot be seen or used by other apps , and can be added to the keystore without any user interaction.
The keystore provider provides the same security benefits that the KeyChain API provides for system-wide credentials, such as binding credentials to a device.
Private keys in the keystore cannot be exported off the device. This reduces root attack surface and likelihood of potential security vulnerabilities.
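Generating such a non-exportable key could be sketched as follows (KeyPairGeneratorSpec and the "AndroidKeyStore" provider were added at API level 18; start and end are placeholder Date values for the certificate validity window):

```java
// Generate an app-private RSA key pair inside the system keystore.
// The private key is stored under the alias and cannot be exported.
KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
        .setAlias("my_signing_key")
        .setSubject(new X500Principal("CN=my_signing_key"))
        .setSerialNumber(BigInteger.ONE)
        .setStartDate(start)   // placeholder validity window
        .setEndDate(end)
        .build();

KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
gen.initialize(spec);
KeyPair pair = gen.generateKeyPair(); // usable for signing, never exported
```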
Systrace uses a new command syntax and lets you collect more types of profiling data. You can now collect trace data from hardware modules, kernel functions, the Dalvik VM (including garbage collection and resource loading), and more.
There's minimal impact on the performance of your app, so timings reported give you an accurate view of what your app is doing.
You can visualize app-specific events in a timeline in the Systrace output file and analyze the events in the context of other kernel and user space trace data.
Together with existing Systrace tags, custom app sections can give you new ways to understand the performance and behavior of your apps.
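Marking an app-defined section is a two-call pattern (android.os.Trace.beginSection/endSection were added at API level 18; decodeThumbnails is a placeholder for the work being measured):

```java
// Mark an app-defined section so it shows up on the Systrace timeline
// alongside kernel and framework events.
Trace.beginSection("decode_thumbnails");
try {
    decodeThumbnails();  // placeholder for the traced work
} finally {
    Trace.endSection();  // must balance every beginSection on this thread
}
```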
On-screen GPU profiling is available as a developer option in Android 4.2. You can choose to display profiling data as on-screen bar or line graphs, with colors indicating time spent creating drawing commands (blue), issuing the commands (orange), and waiting for the commands to complete (yellow).
The system updates the on-screen graphs continuously, displaying a graph for each visible Activity, including the navigation bar and notification bar.
If you see operations that cross the green line, you can analyze them further using Systrace and other tools.
Improvements in the hardware-accelerated 2D renderer make common animations such as scrolling and swiping smoother and faster.
In particular, drawing is optimized for layers, clipping, and certain shapes (rounded rects, circles, and ovals). A variety of WebView rendering optimizations make scrolling of web pages smoother and free from jitter and lags.
All screen sizes now feature the status bar on top, with pull-down access to notifications and a new Quick Settings menu.
The familiar system bar appears on the bottom, with buttons easily accessible from either hand. The Application Tray is also available on all screen sizes.
Now several users can share a single Android tablet , with each user having convenient access to a dedicated user space.
Users can switch to their spaces with a single touch from the lock screen. On a multiuser device, Android gives each user a separate environment, including user-specific emulated SD card storage.
Users also have their own homescreens, widgets, accounts, settings, files, and apps, and the system keeps these separate.
All users share core system services, but the system ensures that each user's applications and data remain isolated.
In effect, each of the multiple users has their own Android device. Users can install and uninstall apps at any time in their own environments.
To save storage space, Google Play downloads an APK only if it's not already installed by another user on the device.
If the app is already installed, Google Play records the new user's installation in the usual way but doesn't download another copy of the app.
Multiple users can run the same copy of an APK because the system creates a new instance for each user, including a user-specific data directory.
For developers, multi-user support is transparent — your apps do not need to do anything special to run normally in a multi-user environment and there are no changes you need to make in your existing or published APKs.
The system manages your app in each user space just as it does in a single-user environment. You can extend app widgets to run on the lock screen, for instant access to your content.
In Android 4.2, users can add as many as five lock screen widgets, choosing from widgets provided by installed apps.