After all, when it appears, it interrupts the anonymous browser and prevents you from fully reaching the Web. Its most common cause is incorrect system settings, specifically a wrongly set date and time. It is best to enable automatic time synchronization over the internet. Antivirus software and other protective tools on a computer are meant to guard the system against viruses and other threats, but they frequently misfire, flagging even perfectly harmless programs as dangerous.
If something like that happens, Tor may begin to misbehave or simply refuse to launch. Restoring it is fairly difficult, so our advice is to reinstall the program. On Windows it is enough to delete the original folder, then visit the download page, download the right version of the browser (don't forget to pick the correct language), and run the installation.
To do this, type about:config into the bar where the URL is normally displayed, open it, and confirm that you accept the risk. Then edit the settings as follows. Be careful here. There is no need to worry too much, though, because you can return to this page at any moment and simply roll back any changes you have made. You can also leave a comment if anything goes wrong, and I will do my best to help.
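To make this more concrete, here is what such edits look like in user.js form (the file equivalent of about:config). The pref names below are standard Firefox networking preferences, but the article does not list its exact keys and values, so treat this fragment purely as an illustrative sketch, not as the settings the author has in mind.

```javascript
// Illustrative about:config / user.js fragment (assumed values, not the article's own list).
user_pref("network.proxy.type", 1);                 // 1 = manual proxy configuration
user_pref("network.proxy.socks", "127.0.0.1");      // local SOCKS host
user_pref("network.proxy.socks_port", 9150);        // Tor Browser's default SOCKS port
user_pref("network.proxy.socks_remote_dns", true);  // resolve DNS through the proxy, not locally
```

If anything breaks, each of these keys can be reset to its default from the same about:config page.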
The SSL Observatory option, meanwhile, analyzes certificates to detect compromised root certificate authorities. The thing is, many sites on the Surface Web have an SSL certificate installed but, for one reason or another, no redirect configured. Because of this, certain web resources may work incorrectly with this extension, and some components may be displayed improperly.
I will explain in detail exactly how it works. In addition, the NoScript extension lets you enable or disable the HTML page components described above for three types of web resources. This way you can avoid not only malicious content but also the various counters that sites use to collect data about their users.
As you probably already know, the Tor Browser is currently the most secure web browser, offering anonymous surfing of the web. But keep in mind that the Internet is useful and very dangerous at the same time. That is why, both on the Surface Web and in the DarkNet, you need to follow certain rules in order to stay anonymous. And as you surely realize, Tor Browser alone is not enough for that.
Before venturing into the DarkNet, cover your camera and disable the microphone and speaker on your device; believe me, this is not a joke. Never disable the NoScript extension in the Tor Browser, under any circumstances. And do not download anything from onion sites unless you want your device hacked and your credit card details stolen.
In general, the safest way to use Tor is to install Tails OS on a USB stick; then no hacker, or even a law enforcement officer, will simply be able to get into that operating system. Passwords, in turn, can be kept in an encrypted volume created with VeraCrypt.
You might think I am obsessed with anonymity, but that is not the case. Unfortunately, I cannot cover every type of malware and every attack method in a single article, so do not dismiss my recommendations as paranoia. Ah, if only someone had explained the benefits of bridges and proxies to me earlier... I probably still would not have used them.
That said, for some of you, configuring the same programs I use may prove too difficult, so we will not cover them here. Instead, let's quickly set up bridges in the Tor Browser, pick some free proxy server as a real-world example, configure it, and then test it with an online checking service. Bridges are anonymous relays that hide host information from your ISP.
Simply put, if your ISP has blocked the IP address of a site you want to visit, you need to enable bridges. They mislead the ISP's filters, which then let the required traffic through. A logical question follows: where do you get these bridges, and where do you enter them? Despite the abundance of online services for this, we will use only the official Tor Project resources.
After that, the IP addresses and keys appear; copy them to the clipboard. Note that you only need to paste the bridges in: Tor Browser saves everything by itself, so don't search in bewilderment for a magic save button. For the changes to take effect, close the browser and then launch it again, and everything will work correctly.
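Before pasting, it can be useful to sanity-check that what you copied actually looks like a bridge line. The sketch below assumes the common obfs4 layout (transport, address:port, a 40-hex-character fingerprint, then options); it is an illustrative validator, not an official Tor parser.

```python
# Sketch: validate the general shape of a bridge line before pasting it
# into Tor Browser's bridge settings. Assumes the common obfs4-style
# field layout; illustrative only, not an official parser.
import re

def parse_bridge_line(line: str) -> dict:
    parts = line.split()
    if len(parts) < 3:
        raise ValueError("bridge line needs transport, address:port, fingerprint")
    transport, addr, fingerprint = parts[0], parts[1], parts[2]
    # IPv4:port, e.g. 192.0.2.7:443
    if not re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}:\d{1,5}", addr):
        raise ValueError(f"bad address:port: {addr}")
    # Relay fingerprints are 40 hex characters
    if not re.fullmatch(r"[0-9A-Fa-f]{40}", fingerprint):
        raise ValueError(f"bad fingerprint: {fingerprint}")
    return {"transport": transport, "addr": addr,
            "fingerprint": fingerprint, "options": parts[3:]}
```

If the function raises, you probably copied the line incompletely and should grab it again from the official source.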
I think we have now sorted out how to set up bridges in Tor Browser, so let's move on to the second part of this section. Next I will explain how to configure a proxy in the Tor browser properly. A proxy is an intermediate server that acts as a middleman between the user and the target server. Through a proxy you can both send and receive requests to network services, and get the responses back.
In other words, all requests to websites are made not from your IP address but from the server's IP. Proxies come in different types, as do the protocols they can work with, but you don't need to worry about that for now. Follow this link, then choose the proxy type (preferably SOCKS5), the country where the servers are located, and the protection level. And don't forget to note down the IP address and port. But I'll warn you right away: I am not sure these free proxy servers can be trusted.
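Given that warning, it is worth at least checking that the proxy you picked is alive before trusting it with traffic. The sketch below uses only the standard library and performs the SOCKS5 no-authentication greeting from RFC 1928; the host and port are placeholders for whichever server you chose.

```python
# Sketch: a minimal "is this SOCKS5 proxy alive?" probe using only the
# standard library. Host/port are placeholders, not a recommended server.
import socket

def socks5_no_auth_ok(reply: bytes) -> bool:
    # A SOCKS5 server accepting the no-authentication method answers 0x05 0x00.
    return reply == b"\x05\x00"

def probe_socks5(host: str, port: int, timeout: float = 5.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(b"\x05\x01\x00")  # version 5, 1 method offered, method 0x00 (no auth)
            return socks5_no_auth_ok(s.recv(2))
    except OSError:
        return False
```

A `True` result only means the handshake succeeded; it says nothing about whether the operator logs your traffic, which is exactly why free proxies deserve suspicion.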
For that reason I recommend using a VPN rather than a proxy; but if you specifically need a proxy, you are better off buying some VPS.
In other browsers, this can be accomplished by DOM events on image or script tags. This open vs filtered vs closed port list can provide a unique fingerprint of a machine, because it essentially enables the detection of many different popular third-party applications and optional system services (Skype, Bitcoin, Bittorrent and other P2P software, SSH ports, SMB and related LAN services, CUPS and printer daemon config ports, mail servers, and so on).
In Tor Browser, we prevent such access: the requests are sent to the local Tor client, which then rejects them, since it is not configured to proxy internal IP addresses by default. Access to the local network is forbidden via the same mechanism.
We also disable the WebRTC API as mentioned previously, since even if it were usable over Tor, it still currently provides the local IP address and associated network information to websites. However, because it is not clear if certain carefully-crafted error conditions in these protocols could cause them to reveal machine information and still fail silently prior to the password prompt, these authentication mechanisms should either be disabled, or placed behind a site permission before their use.
We simply disable them with a patch. The GamePad API provides web pages with the USB device id, product id, and driver name of all connected game controllers, as well as detailed information about their capabilities. For systems without a game controller, a standard controller can be virtualized through the keyboard, which serves both to improve usability by normalizing user interaction with different games and to eliminate fingerprinting vectors.
For now though, we simply disable it via the pref dom. According to the Panopticlick study, fonts provide the most linkability when they are available as an enumerable list in file system order, via either the Flash or Java plugins. With a large enough pre-built list to query, a large amount of fingerprintable information may still be available, especially given that additional fonts often end up installed by third party software and for multilingual support. Implementation Status: We investigated shipping a predefined set of fonts to all of our users allowing only those fonts to be used by websites at the exclusion of system fonts.
We are currently following this approach, which has been suggested by researchers previously. This defense is available for all three supported platforms: Windows, macOS, and Linux, although the implementations vary in detail. For Windows and macOS we use a preference, font. The whitelist for Windows and macOS contains both a set of Noto fonts which we bundle and fonts provided by the operating system. For Linux systems we only bundle fonts and deploy a fonts. In addition to that we set the font.
This is not guaranteed even if we bundle all the fonts Tor Browser uses, as fonts may be loaded in a different order on different systems. Setting the above-mentioned preferences works around this issue by specifying the font to use explicitly. Allowing fonts provided by the operating system for Windows and macOS users is currently a compromise between fingerprintability resistance and usability concerns.
Since many aspects of desktop widget positioning and size are user configurable, these properties yield customized information about the computer, even beyond the monitor size. Design Goal: Our design goal here is to reduce the resolution information down to the bare minimum required for properly rendering inside a content window. We intend to report all rendering information correctly with respect to the size and properties of the content window, but report an effective size of 0 for all border material, and also report that the desktop is only as big as the inner content window.
As an alternative to zoom-based solutions, we are testing a different approach in our alpha series that tries to round the browser window at all times to a fixed pixel multiple. Regardless of which solution we finally pick, until it becomes available the user should also be informed that maximizing their windows can lead to fingerprintability under the current scheme. Implementation Status: We automatically resize new browser windows to a fixed pixel multiple based on desktop resolution by backporting patches from the relevant bug and setting privacy.
To minimize the effect of the long tail of large monitor sizes, we also cap the window size at a fixed number of pixels in each direction. In addition to that we set privacy. Similarly, we use that preference to return content-window-relative points for DOM events. We also force popups to open in new tabs via browser. In addition, we prevent auto-maximizing on browser start, and inform users that maximized windows are detrimental to privacy in this mode.
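The rounding-and-capping scheme described above can be sketched as a small function. The step sizes (200x100) and the cap (1000 pixels) used here are illustrative placeholders, since the exact values are not stated in the text.

```python
# Sketch of the window-dimension bucketing described above: round each
# dimension down to a step multiple, never below one step, never above a
# cap. Step and cap values are assumptions for illustration.
def rounded_window_size(width: int, height: int,
                        step_w: int = 200, step_h: int = 100,
                        cap: int = 1000) -> tuple:
    w = min(max(width - width % step_w, step_w), cap)
    h = min(max(height - height % step_h, step_h), cap)
    return (w, h)
```

The effect is that many users with different monitors report the same handful of content-window sizes, shrinking the entropy available from screen dimensions.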
WebGL is fingerprintable both through information that is exposed about the underlying driver and optimizations, as well as through performance fingerprinting. Because of the large amount of potential fingerprinting vectors and the previously unexposed vulnerability surface, we deploy a similar strategy against WebGL as for plugins. First, WebGL Canvases have click-to-play placeholders provided by NoScript, and do not run until authorized by the user. Second, we obfuscate driver information by setting the Firefox preferences webgl.
Furthermore, WebGL2 is disabled by setting webgl. To make the minimal WebGL mode usable we additionally normalize its properties with a Firefox patch. Another option for WebGL might be to use software-only rendering, using a library such as Mesa.
The use of such a library would avoid hardware-specific rendering differences. The MediaDevices API provides access to connected media input devices like cameras and microphones, as well as screen sharing. In particular, it allows web content to easily enumerate those devices with MediaDevices. Nevertheless, we disable this feature for now as a defense-in-depth by setting media.
In order to prevent that, we implemented two defenses: first, we disable the Touch API by setting dom. Second, for those users who really need or want this API available, we patched the code to return content-window-related coordinates. Furthermore, we made sure that the touch area described by Touch. That is achieved by a direct Firefox patch which reports back 1 for the first two properties and 0. From Firefox 52 on, it is disabled for web content. Initially, it was possible on Linux to get a double-precision floating point value for the charge level, which meant there was a large number of possible values, making it behave almost like an identifier that allows tracking a user cross-origin.
But even after that was fixed, and on other platforms where the precision was only two significant digits anyway, the risk of tracking users remained: combined with the chargingTime and dischargingTime values, the possible combinations were estimated to number in the millions under normal conditions. It is possible to get the system uptime of a Tor Browser user by querying the Event.
We avoid this by setting dom. This might seem counterintuitive at first glance, but the effect of setting that preference to true is a normalization of evt. Together with clamping the timer resolution, this provides an effective means against system uptime fingerprinting.
KeyboardEvents provide a way for a website to find out information about the keyboard layout of its visitors. In fact, there are several dimensions to this fingerprinting vector. The KeyboardEvent. On the other hand, the KeyboardEvent. This is dependent on things like keyboard layout, locale, and modifier keys. Characters from non-en-US languages currently return an empty KeyboardEvent.
Moreover, neither Alt, Shift, nor AltGr keyboard events are reported to content. We are currently not taking the actually deployed browser locale, or the locale indicated by a loaded document, into account when spoofing the keyboard layout. We think that would be the right thing to do in the longer run, to mitigate possible usability issues and broken functionality on websites. Similarly to how users of non-English Tor Browser bundles can currently choose between keeping the Accept header spoofed or not, they would then be able to keep a spoofed English keyboard, or one spoofed depending on the actual Tor Browser locale or the language of the document.
We omit the Firefox minor revision, and report a popular Windows platform. If the software is kept up to date, these headers should remain identical across the population even when updated. Implementation Status: Firefox provides several options for controlling the browser user agent string which we leverage. We also set similar prefs for controlling the Accept-Language and Accept-Charset headers, which we spoof to English by default. Additionally, we remove content script access to Components.
Attacks based on timing side channels are nothing new in the browser context. Cache-based, cross-site timing, and pixel-stealing attacks, to name just a few, have been investigated in the past. While their fingerprinting potential varies, all timing-based attacks have in common that they need sufficiently fine-grained clocks.
Implementation Status: The cleanest solution to timing-based side channels would be to get rid of them. This has been proposed in the research community. However, we remain skeptical as it does not seem to be trivial even considering just a single side channel and more and more potential side channels are showing up. Thus, we rely on disabling all possible timing sources or making them coarse-grained enough in order to render timing side channels unsuitable as a means for fingerprinting browser users.
We set dom. Furthermore, we clamp the resolution of explicit clocks with two Firefox patches. This includes performance. While clamping the clock resolution is a step towards mitigating timing-based side channel fingerprinting, it is by no means sufficient. It turns out that it is possible to subvert our clamping of explicit clocks by using implicit ones. We are tracking this problem in our bug tracker and are working with the research community and Mozilla to develop and test a proper solution to this part of our defense against timing-based side channel fingerprinting risks.
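The clock-clamping idea itself is simple enough to sketch: an explicit timestamp is rounded down to a coarse bucket before it reaches content. The 100 ms resolution used here is an assumption for illustration, not necessarily the exact granularity deployed.

```python
# Sketch of clamping an explicit clock to a coarse resolution, so that
# high-resolution timing side channels lose precision. The 100 ms
# default is an illustrative assumption.
def clamp_ms(timestamp_ms: float, resolution_ms: int = 100) -> float:
    # Round down to the nearest resolution bucket.
    return (timestamp_ms // resolution_ms) * resolution_ms
```

As the surrounding text notes, this only covers explicit clocks; an attacker can rebuild a finer clock from implicit timing sources, which is why clamping alone is not sufficient.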
Due to bugs in Firefox it is possible to detect the locale and the platform of a Tor Browser user. Moreover, it is possible to find out which extensions a user has installed. This does not happen if the extension is indeed installed but the resource path does not exist. There are more than a dozen Firefox resources that do not aid in fingerprinting Tor Browser users, as they do not differ across the platforms and locales we support.
As long as these attacks take several seconds or more to execute, they are unlikely to be appealing to advertisers, and are also very likely to be noticed if deployed against a large number of people. Implementation Status: Currently, our mitigation against performance fingerprinting is to disable Navigation Timing by setting the Firefox preference dom.
Implementation Status: We set dom. However, there are probabilistic ways of determining the same information, which we are not currently defending against. Moreover, we might even want to consider a more elaborate defense against this fingerprinting technique: instead of making all users uniform, we could follow a bucket approach, as we currently do in our defense against screen size exfiltration.
However, there are more bits of information that the Web Audio API reveals if audio signals generated with an OscillatorNode are processed as hardware and software differences influence those results. That has the positive side effect that it disables one of several means to perform ultrasound cross-device tracking as well, which is based on having AudioContext available. The MediaError object allows the user agent to report errors that occurred while handling media, for instance using audio or video elements.
The message property provides specific diagnostic information to help in understanding the error condition. As a defense-in-depth we make sure that no information aiding in fingerprinting leaks to websites that way, by returning just an empty string. It is possible to monitor the connection state of a browser over time with navigator. We prevent this by setting network.
Reader View is a Firefox feature for viewing web pages clutter-free, easily adjusted to one's own needs and preferences. To avoid fingerprintability risks we make Tor Browser users uniform by setting reader. This makes sure that documents are not parsed on load (as this is disabled on some devices due to memory consumption), and we pretend that everybody has already used that feature in the past.
Tor Browser is based on Firefox which is a Mozilla product. Quite naturally, Mozilla is interested in making users aware of new features and in gathering information to learn about the most pressing needs Firefox users are facing. This is often implemented by contacting Mozilla services, be it for displaying further information about a new feature or by sending aggregated data back for analysis.
While some of those mechanisms, such as telemetry data, are disabled by default on release channels, others are not. We make sure that none of those Mozilla services are contacted, to avoid possible fingerprinting risks. In particular, we disable GeoIP-based search results by setting browser. Furthermore, we disable Selfsupport and Unified Telemetry by setting browser. The same is done with datareporting. Additionally, we disable the UITour backend by setting browser. On the update side, we prevent the browser from pinging the new Kinto service for blocklist updates, as it is not used for them yet anyway.
This is done by setting services. The captive portal detection code is disabled as well, as it phones home to Mozilla. We set network. Unrelated to that, we make sure that Mozilla does not get bothered with TLS error reports from Tor Browser users by hiding the respective checkbox with security. AddonManager API. We have Safebrowsing disabled in Tor Browser. In order to avoid pinging providers for list updates, we remove the entries for browser.
As we mentioned in the introduction of this section, OS type fingerprinting is currently considered a lower priority, due simply to the numerous ways that characteristics of the operating system type may leak into content, and the comparatively low contribution of OS to overall entropy. In particular, there are likely to be many ways to measure the differences in widget size, scrollbar size, and other rendered details on a page. Also, directly exported OS routines such as those from the standard C math library expose differences in their implementations through their return values.
Design Goal: We intend to reduce or eliminate OS type fingerprinting to the best extent possible, but recognize that the effort for reward on this item is not as high as other areas. The entropy on the current OS distribution is somewhere around 2 bits, which is much lower than other vectors which can also be used to fingerprint configuration and user-specific information. We disable these APIs through the Firefox preferences dom. For more details on fingerprinting bugs and enhancements, see the tbb-fingerprinting tag in our bug tracker.
Finally, a fresh browser window is opened, and the current browser window is closed this does not spawn a new Firefox process, only a new window. Upon the close of the final window, an unload handler is fired to invoke the garbage collector , which has the effect of immediately purging any blob:UUID URLs that were created by website content via URL. In addition to the above mechanisms that are devoted to preserving privacy while browsing, we also have a number of technical mechanisms to address other privacy and security issues.
In order to provide vulnerability surface reduction for users that need high security, we have implemented a "Security Slider" to allow users to make a tradeoff between usability and security while minimizing the total number of choices to reduce fingerprinting.
At this security level, the preferences are the Tor Browser defaults. This includes three features that were formerly governed by the slider at higher security levels: gfx. Even though Mozilla reverted that decision after another round of fixing critical Graphite bugs, we remain skeptical and keep that feature disabled for now.
While Mozilla is working on getting this disabled again, we take a protective stance now and block remote JAR files even at the low security level. Finally, we exempt asm. See the Disk Avoidance and the cache linkability concerns in the Cross-Origin Identifier Unlinkability sections for further details. This security level inherits the preferences from the Medium level, and additionally disables remote fonts noscript.
Website Traffic Fingerprinting is a statistical attack to attempt to recognize specific encrypted website activity. We want to deploy a mechanism that reduces the accuracy of useful features available for classification. This mechanism would either impact the true and false positive accuracy rates, or reduce the number of web pages that could be classified at a given accuracy rate.
Ideally, this mechanism would be as light-weight as possible, and would be tunable in terms of overhead. We suspect that it may even be possible to deploy a mechanism that reduces feature extraction resolution without any network overhead. It may be also possible to tune such defenses such that they only use existing spare Guard bandwidth capacity in the Tor network, making them also effectively no-overhead. Currently, we patch Firefox to randomize pipeline order and depth.
Unfortunately, pipelining is very fragile. Many sites do not support it, and even sites that advertise support for pipelining may simply return error codes for successive requests, effectively forcing the browser into non-pipelined behavior. Firefox also has code to back off and reduce or eliminate the pipeline if this happens.
It turns out that we could actually deploy exit-side proxies that allow us to use SPDY from the client to the exit node. This would make our defense not only free, but one that actually improves performance. Knowing this, we created this defense as an experimental research prototype to help evaluate what could be done in the best case with full server support. Unfortunately, the bias in favor of compelling attack papers has caused academia to ignore this request thus far, instead publishing only cursory yet "devastating" evaluations that fail to provide even simple statistics such as the rates of actual pipeline utilization during their evaluations, in addition to the other shortcomings and shortcuts mentioned earlier.
We can accept that our defense might fail to work as well as others (in fact, we expect it), but unfortunately the very same shortcuts that produce excellent attack results also allow the conclusion that all defenses are broken forever. So, sadly, we are still left in the dark on this point. In order to inform the user when their Tor Browser is out of date, we perform a privacy-preserving update check asynchronously in the background.
If the value from our preference is present in the recommended version list, the check is considered to have succeeded and the user is up to date. If not, it is considered to have failed and an update is needed. The check is triggered upon browser launch, new window, and new tab, but is rate limited so as to happen no more frequently than once every 1. If the check fails, we cache this fact, and update the Torbutton graphic to display a flashing warning icon and insert a menu option that provides a link to our download page.
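The success condition of the check described above reduces to a simple membership test plus rate limiting. The sketch below uses illustrative names (these are not Tor Browser's actual preference keys or internal functions), and the rate-limit interval is an arbitrary placeholder.

```python
# Sketch of the update check logic described above: the running version is
# compared against a downloaded list of recommended versions, and the
# check itself is rate limited. Names and the interval are illustrative.
import time

def update_needed(current_version: str, recommended: list) -> bool:
    # The check "succeeds" (no update needed) iff the running version
    # appears in the recommended-versions list.
    return current_version not in recommended

def should_check(last_check: float, min_interval_s: float, now: float = None) -> bool:
    # Trigger points (launch, new window, new tab) call this; it only
    # allows a real check once the minimum interval has elapsed.
    now = time.time() if now is None else now
    return now - last_check >= min_interval_s
```

On a failed check, the result would be cached and the UI updated, as the surrounding text describes.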
Additionally, we reset the value for the browser homepage to point to a page that informs the user that their browser is out of date. We also make use of the in-browser Mozilla updater, and have patched the updater to avoid sending OS and Kernel version information as part of its update pings. In the age of state-sponsored malware, we believe it is impossible to expect to keep a single build machine or software signing key secure, given the class of adversaries that Tor has to contend with.
For this reason, we have deployed a build system that allows anyone to use our source code to reproduce byte-for-byte identical binary packages to the ones that we distribute. The GNU toolchain has been working on providing reproducible builds for some time, however a large software project such as Firefox typically ends up embedding a large number of details about the machine it was built on, both intentionally and inadvertently. Additionally, manual changes to the build machine configuration can accumulate over time and are difficult for others to replicate externally, which leads to difficulties with binary reproducibility.
For this reason, we decided to leverage the work done by the Gitian Project from the Bitcoin community. A Gitian descriptor document is used to install a qemu-kvm image and execute your build scriptlet inside it. We have created a set of wrapper scripts around Gitian to automate dependency download and authentication, as well as transfer intermediate build outputs between the stages of the build process. Because Gitian creates a Linux build environment, we must use cross-compilation to create packages for Windows and macOS.
For Windows, we use mingw-w64 as our cross compiler. On top of what Gitian provides, we also had to address the following additional sources of non-determinism:. Many file archivers walk the file system in inode structure order by default, which will result in ordering differences between two different archive invocations, especially on machines of different disk and hardware configurations. The fix for this is to perform an additional sorting step on the input list for archives, but care must be taken to instruct libc and other sorting routines to use a fixed locale to determine lexicographic ordering, or machines with different locale settings will produce different sort results.
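The sorted-input idea can be illustrated with a tiny in-memory archiver: sort paths bytewise (locale-independent, like running the sort under LC_ALL=C) and pin every timestamp and ownership field before writing, so two runs on different machines yield byte-identical output. This mirrors the approach the wrapper scripts take, but is a sketch, not the project's actual tooling.

```python
# Sketch: a deterministic tar writer. File order is fixed by a bytewise
# sort of names, and all metadata that normally varies between machines
# (mtime, uid/gid, owner names) is pinned to constants.
import io
import tarfile

def deterministic_tar(files: dict, mtime: int = 0) -> bytes:
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name in sorted(files):          # codepoint sort, no locale involved
            data = files[name]
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            info.mtime = mtime              # fixed timestamp
            info.uid = info.gid = 0         # fixed ownership
            info.uname = info.gname = ""
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()
```

Because every varying field is pinned, the archive bytes depend only on the file names and contents, which is exactly the property reproducible builds need.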
We created wrapper scripts for tar, zip, and DMG to aid in reproducible archive creation. We ran into difficulties with both binutils and the DMG archive script using uninitialized memory in certain data structures that ended up written to disk. Our binutils fixes were merged upstream, but the DMG archive fix remains an independent patch. The standard way of controlling timestamps in Gitian is to use libfaketime, which hooks time-related library calls to provide a fixed timestamp.
However, due to our use of wine to run py2exe for python-based pluggable transports, pyc timestamps had to be addressed with an additional helper script. In two circumstances, deliberately generated entropy was introduced in various components of the build process. First, the BuildID Debuginfo identifier which associates detached debug files with their corresponding stripped executables was introducing entropy from some unknown source.
Second, on Linux, Firefox builds detached signatures of its cryptographic libraries using a temporary key for FIPS certification. A rather insane subsection of the FIPS certification standard requires that you distribute signatures for all of your cryptographic libraries. The Firefox build process meets this requirement by generating a temporary key, using it to sign the libraries, and discarding the private portion of that key. Because there are many other ways to intercept the crypto outside of modifying the actual DLL images, we opted to simply remove these signature files from distribution.
There simply is no way to verify code integrity on a running system without both OS and co-processor assistance. Download package signatures make sense of course, but we handle those another way as mentioned above. Gitian provides an option to use LXC containers instead of full qemu-kvm virtualization.
Unfortunately, these containers can allow additional details about the host OS to leak. In particular, umask settings as well as the hostname and Linux kernel version can leak from the host OS into the LXC container. We addressed umask by setting it explicitly in our Gitian descriptor scriptlet, and addressed the hostname and kernel version leaks by directly patching the aspects of the Firefox build process that included this information into the build.
It also turns out that some libraries (in particular libgmp) attempt to detect the current CPU to determine which optimizations to compile in. The build process generates a single shasums-unsigned-build. The build scripts have an optional matching step that downloads these signatures, verifies them, and ensures that the local builds match this file.
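The matching step boils down to recomputing SHA-256 digests and comparing them against the shasums file. The sketch below assumes the conventional sha256sum line format (`<hex digest>  <filename>`); it is illustrative, not the project's actual verification script.

```python
# Sketch: verify downloaded packages against a list of SHA-256 hashes in
# sha256sum's "<hex>  <filename>" format. Illustrative only.
import hashlib

def parse_shasums(text: str) -> dict:
    sums = {}
    for line in text.strip().splitlines():
        digest, name = line.split(None, 1)
        sums[name.strip()] = digest.lower()
    return sums

def verify(data: bytes, name: str, sums: dict) -> bool:
    # Recompute the digest locally and compare against the published one.
    return hashlib.sha256(data).hexdigest() == sums.get(name)
```

Because any independent builder can recompute the same file, a mismatch here is strong evidence that either the download or the official build was tampered with.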
When builds are published officially, the single shasums-unsigned-build. The packages are additionally signed with detached GPG signatures from an official signing key. The fact that the entire set of packages for a given version can be authenticated by a single hash of the shasums-unsigned-build.
Interesting examples include providing multiple independent cryptographic signatures for packages, listing the package hashes in the Tor consensus, and encoding the package hashes in the Bitcoin blockchain. The Windows releases are also signed by a hardware token provided by Digicert. In order to verify package integrity, the signature must be stripped off using the osslsigncode tool, as described on the Signature Verification page.
Because bit-identical packages can be produced by anyone, the security of this build system extends beyond the security of the official build machines. In fact, build integrity can still be achieved even if all official build machines are compromised. By default, all tor-specific dependencies and inputs to the build process are downloaded over Tor, which allows build verifiers to remain anonymous and hidden.
Because of this, any individual can use our anonymity network to privately download our source code, verify it against public, signed, audited, and mirrored git repositories, and reproduce our builds exactly, without being subject to targeted attacks.
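One consequence of anyone being able to reproduce the builds is that trust can be distributed: a cautious verifier could require that several independent rebuilders report the same package hash before accepting it. A toy sketch of such a quorum check (the data shapes and threshold policy are illustrative, not part of any official tool):

```python
from collections import Counter
from typing import Dict, Optional

def quorum_hash(attestations: Dict[str, str], threshold: int) -> Optional[str]:
    """Given a map of signer -> reported package hash, return the hash that
    at least `threshold` independent signers agree on, or None otherwise."""
    if not attestations:
        return None
    digest, votes = Counter(attestations.values()).most_common(1)[0]
    return digest if votes >= threshold else None

# Three of four independent rebuilders agree; one (perhaps compromised) differs.
reports = {"builder-a": "abc123", "builder-b": "abc123",
           "builder-c": "abc123", "builder-d": "ff0000"}
print(quorum_hash(reports, threshold=3))  # abc123
```

Under this policy, compromising a single build machine (even an official one) is not enough to push a malicious package past a verifier who demands a quorum.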
We make use of the Firefox updater in order to provide automatic updates to users. We use certificate pinning, enforced via the corresponding security.* preference, to ensure that update checks cannot be tampered with. The Firefox updater also has code to ensure that it can reliably access the update server to prevent availability attacks, and it complains to the user if 48 hours go by without a successful response from the server.

The privacy properties of Tor Browser are based upon the assumption that link-click navigation indicates user consent to tracking between the linking site and the destination site.
While this definition is sufficient to allow us to eliminate cross-site third party tracking with only minimal site breakage, it is our long-term goal to further reduce cross-origin click navigation tracking to mechanisms that are detectable by attentive users, so they can alert the general public if cross-origin click navigation tracking is happening where it should not be. In an ideal world, the mechanisms of tracking that can be employed during a link click would be limited to the contents of URL parameters and other properties that are fully visible to the user before they click.
However, the entrenched nature of certain archaic web features makes it impossible for us to achieve this transparency goal by ourselves without substantial site breakage. So instead we maintain a Deprecation Wishlist of archaic web technologies that are currently being abused to facilitate federated login and other legitimate click-driven cross-domain activity, but that can one day be replaced with more privacy-friendly, auditable alternatives.
Because the total elimination of side channels during cross-origin navigation will undoubtedly break federated login as well as destroy ad revenue, we also describe auditable alternatives and promising web draft standards that would preserve this functionality while still providing transparency when tracking is occurring.
When leaving a .onion site, the Referer header is not sent. That avoids leaking information which might be especially problematic in the case of transitioning from a .onion site to a non-onion one. In fact, a great deal of personal data is inadvertently leaked to third parties through the source URL parameters. We believe the Referer header should be made explicit, and believe that Referrer Policy, which has been available since Firefox 52, provides a decent step in this direction.
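To make the stakes concrete, here is a deliberately simplified Python model of how three common Referrer Policy values decide what a cross-origin navigation reveals (this is a reduced model of the spec, not browser code; it ignores scheme downgrades and default-policy fallback):

```python
from typing import Optional
from urllib.parse import urlsplit

def referrer_for(policy: str, from_url: str, to_url: str) -> Optional[str]:
    """What Referer value (if any) a navigation from `from_url` to `to_url`
    carries under three common Referrer Policy values."""
    src, dst = urlsplit(from_url), urlsplit(to_url)
    same_origin = (src.scheme, src.netloc) == (dst.scheme, dst.netloc)
    if policy == "no-referrer":
        return None                               # never send anything
    if policy == "same-origin":
        return from_url if same_origin else None  # full URL, same-origin only
    if policy == "origin":
        return f"{src.scheme}://{src.netloc}/"    # origin only: path/query withheld
    raise ValueError(f"policy not modeled here: {policy}")

# Cross-origin click: "origin" keeps /private?token=1 away from the destination.
print(referrer_for("origin", "https://a.example/private?token=1",
                   "https://b.example/"))  # https://a.example/
```

Even the strictest non-empty policy here still reveals the source origin, which is why the text above argues for making any Referer transmission explicit and visible to the user.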
If a site wishes to transmit its URL to third party content elements during load or during link-click, it should have to specify this as a property of the associated HTML tag or in an HTTP response header. With an explicit property or response header, it would then be possible for the user agent to inform the user if they are about to click on a link that will transmit Referer information (perhaps through something as subtle as a different color in the lower toolbar for the destination URL).
This same UI notification can also be used for links with the "ping" attribute. The window.name DOM property can likewise be utilized for identifier storage during click navigation; it is sometimes used for additional CSRF protection and federated login. Javascript rewriting of link destinations at click time, by contrast, is deceptive and is frequently a vector for malware and phishing attacks.
Unfortunately, many legitimate sites also employ such transparent link rewriting, and blanket disabling this functionality ourselves will simply cause Tor Browser to fail to navigate properly on these sites. Automated cross-origin redirects are one form of this behavior that is possible for us to address ourselves, as they are comparatively rare and can be handled with site permissions.
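A site-permission approach to automated cross-origin redirects could start by identifying where a redirect chain changes hosts. A sketch with illustrative names (a real implementation would compare registrable domains, not raw hostnames):

```python
from urllib.parse import urlsplit

def cross_origin_hops(redirect_chain):
    """Return (from_host, to_host) pairs where an automated redirect crossed
    hosts -- the hops a permission prompt or stored site permission would
    need to cover."""
    hops = []
    for a, b in zip(redirect_chain, redirect_chain[1:]):
        host_a, host_b = urlsplit(a).netloc, urlsplit(b).netloc
        if host_a != host_b:
            hops.append((host_a, host_b))
    return hops

chain = ["https://a.example/x", "https://a.example/y",
         "https://tracker.example/r", "https://b.example/"]
print(cross_origin_hops(chain))
# [('a.example', 'tracker.example'), ('tracker.example', 'b.example')]
```

Same-host hops need no prompt; only the cross-host hops would consult or record a site permission, which keeps the user-facing friction proportional to how rare these redirects are.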
Web-Send is a browser-based link sharing and federated login widget that is designed to operate without relying on third-party tracking or abusing other cross-origin link-click side channels. It has a compelling list of privacy and security features, especially if used as a "Like button" replacement.
Mozilla Persona, while it does not directly provide the link sharing capabilities that Web-Send does, is a better solution to the privacy issues associated with federated login than Web-Send is.

Table of Contents

1. Introduction
   1.1. Browser Component Overview
2. Design Requirements and Philosophy
   2.1. Security Requirements
   2.2. Privacy Requirements
   2.3. Philosophy
3. Adversary Model
   3.1. Adversary Goals
   3.2. Adversary Capabilities - Positioning
   3.3. Adversary Capabilities - Attacks
4. Implementation
   4.1. Proxy Obedience
   4.2. State Separation
   4.3. Disk Avoidance
   4.4. Application Data Isolation
   4.5. Cross-Origin Identifier Unlinkability
   4.6. Cross-Origin Fingerprinting Unlinkability
   4.7. Long-Term Unlinkability via "New Identity" button
   4.8. Other Security Measures
5. Build Security and Package Integrity
   5.1. Achieving Binary Reproducibility
   5.2. Package Signatures and Verification
   5.3. Anonymous Verification
   5.4. Update Safety
A. Towards Transparency in Navigation Tracking
   A.1. Deprecation Wishlist
   A.2. Promising Standards

Browser Component Overview.

Design Requirements and Philosophy.

Security Requirements.

Disk Avoidance: The browser MUST NOT write any information that is derived from or that reveals browsing activity to the disk, or store it in memory beyond the duration of one browsing session, unless the user has explicitly opted to store their browsing history information to disk.

Privacy Requirements.
Long-Term Unlinkability: The browser MUST provide an obvious, easy way for the user to remove all of its authentication tokens and browser state and obtain a fresh identity.

Preserve existing user model: The existing way that the user expects to use a browser must be preserved.

Favor the implementation mechanism least likely to break sites: In general, we try to find solutions to privacy issues that will not induce site breakage, though this is not always possible.

Plugins must be restricted: Even if plugins always properly used the browser proxy settings (which none of them do) and could not be induced to bypass them (which all of them can), the activities of closed-source plugins are very difficult to audit and control.
Stay Current: We believe that if we do not stay current with the support of new web technologies, we cannot hope to substantially influence or be involved in their proper deployment or privacy realization.

Adversary Model.

Adversary Goals.

Correlation of Tor vs Non-Tor Activity: If direct proxy bypass is not possible, the adversary will likely happily settle for the ability to correlate something a user did via Tor with their non-Tor activity.

Correlate activity across multiple sites: The primary goal of the advertising networks is to know that the user who visited siteX is the same user that visited siteY.
History records and other on-disk information: In some cases, the adversary may opt for a heavy-handed approach, such as seizing the computers of all Tor users in an area (especially after narrowing the field by the above two pieces of information).

Adversary Capabilities - Positioning.

Exit Node or Upstream Router: The adversary can run exit nodes, or alternatively, they may control routers upstream of exit nodes.
Physical Access: Some users face adversaries with intermittent or constant physical access.

Adversary Capabilities - Attacks.

Read and insert identifiers: The browser contains multiple facilities for storing identifiers that the adversary creates for the purposes of tracking users.

Fingerprint users based on browser attributes: There is an absurd amount of information available to websites via attributes of the browser.
Proxy Obedience.

Disabling plugins: Plugins, like Flash, have the ability to make arbitrary OS system calls and bypass proxy settings.

External App Blocking and Drag Event Filtering: External apps can be induced to load files that perform network activity.

Disabling system extensions and clearing the addon whitelist: Firefox addons can perform arbitrary activity on your computer, including bypassing Tor.
State Separation.

Disk Avoidance.

Design Goal:

Implementation Status:

We are working towards this goal through several mechanisms. First, we set the Firefox Private Browsing preference browser. We also had to disable the media cache with the pref media.
Finally, we set security. As an additional defense-in-depth measure, we set browser. Many of these preferences are likely redundant with browser. For more details on disk leak bugs and enhancements, see the tbb-disk-leak tag in our bugtracker.
Application Data Isolation.

Cross-Origin Identifier Unlinkability.

Improving the Privacy UI.

This example UI is a mock-up of how isolating identifiers to the URL bar domain can simplify the privacy UI for all data - not just cookies. Once browser identifiers and site permissions operate on a URL bar basis, the same privacy window can represent browsing history, DOM Storage, HTTP Auth, search form history, login values, and so on within a context menu for each site.
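Double-keying identifiers to the URL bar domain can be pictured as a cookie jar keyed by the (first-party domain, cookie host) pair rather than by cookie host alone. A toy model, not Firefox's implementation:

```python
class IsolatedCookieJar:
    """Toy cookie store double-keyed by (url_bar_domain, cookie_host), so the
    same third party embedded on two different sites gets two unrelated
    cookie stores."""

    def __init__(self):
        self._jar = {}

    def set(self, url_bar_domain, cookie_host, name, value):
        self._jar.setdefault((url_bar_domain, cookie_host), {})[name] = value

    def get(self, url_bar_domain, cookie_host, name):
        return self._jar.get((url_bar_domain, cookie_host), {}).get(name)

jar = IsolatedCookieJar()
# tracker.example sets an id while embedded on site-a...
jar.set("site-a.example", "tracker.example", "uid", "42")
# ...but sees a completely empty store under site-b: no cross-site linkage.
print(jar.get("site-b.example", "tracker.example", "uid"))  # None
print(jar.get("site-a.example", "tracker.example", "uid"))  # 42
```

Because every identifier is scoped this way, clearing or inspecting "everything under site-a.example" becomes a single-key operation, which is exactly what lets the per-site privacy UI described above stay simple.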
Favicons

Design Goal: When visiting a website, its favicon is fetched via a request originating from the browser itself (similar to the OCSP mechanism mentioned in the previous section).

Speculative and prefetched connections

Firefox provides the feature to connect speculatively to remote hosts if that is indicated in the HTML file.

Cross-Origin Fingerprinting Unlinkability.

Sources of Fingerprinting Issues.

End-user Configuration Details: End-user configuration details are by far the most severe fingerprinting threat, as they will quickly provide enough information to uniquely identify a user.
Device and Hardware Characteristics: Device and hardware characteristics can be determined in three ways: they can be reported explicitly by the browser, they can be inferred through browser functionality, or they can be extracted through statistical measurements of system performance.

Operating System Vendor and Version Differences: Operating system vendor and version differences permeate many different aspects of the browser.

User Behavior: While somewhat outside the scope of browser fingerprinting, for completeness it is important to mention that users themselves theoretically might be fingerprinted through their behavior while interacting with a website.
Browser Vendor and Version Differences: Due to vast differences in feature set and implementation behavior, even between different minor versions of the same browser, browser vendor and version differences are simply not possible to conceal in any realistic way.

General Fingerprinting Defenses.
Subsystem Modification or Reimplementation: In cases where simple spoofing is not enough to properly conceal underlying device characteristics or operating system details, the underlying subsystem that provides the functionality for a feature or API may need to be modified or completely reimplemented.

Virtualization: Virtualization is needed when simply reimplementing a feature in a different way is insufficient to fully conceal the underlying behavior.