BRB Risk Jobs Board — Conflicts Attorney (Fisher Phillips)


This week, I’m pleased to highlight an open role at Fisher Phillips. The job description is below; to apply, visit their job application portal.

  • Fisher Phillips, a premier international labor and employment law firm, is seeking an experienced Conflicts Attorney to join our New Business Intake team. This role is ideal for candidates with direct law firm conflicts experience—professionals who regularly analyze conflict reports, apply ethical rules, coordinate resolutions, and collaborate with Risk Management, General Counsel, or Intake teams.
  • In this high-impact position, you will be responsible for evaluating and resolving potential conflicts involving new business, lateral hires, and RFPs. You’ll work closely with the General Counsel’s Office and attorneys across the firm, making informed decisions that directly protect the firm and its clients.
  • We’re looking for a detail-driven conflicts professional with 2+ years of hands-on conflicts analysis in a law firm environment (such as Conflicts Attorney, Conflicts Analyst, Risk Management Attorney, or Ethics/Professional Responsibility role). Candidates without prior conflicts experience will not be considered for this position.
  • If you have a strong command of conflicts rules, experience with conflicts databases, and a passion for safeguarding ethical compliance, we encourage you to apply.
  • This is a full-time, fully remote position. Applicants based on the West Coast—or those willing to work West Coast business hours—are welcome to apply.
  • Please note: This role is not an entry point into practice. Applicants must have substantive conflicts or law-firm risk-management experience.


Key Responsibilities

  • Analyze complex conflict reports and exercise independent judgment to identify potential conflict-of-interest issues with new business, lateral hires, and requests for proposal.
  • Conduct research to gather information or clarification on potential issues, including the appropriate jurisdictional conflict and/or ethical rules and opinions to assist in determining specific conflict resolution strategies.
  • Collaborate with attorneys, paralegals, and support staff to gather necessary information for conflict analysis.
  • Take initiative to analyze and resolve conflicts independently.
  • Prepare clear and concise communications to attorneys, identifying all potential issues found in the conflict report, providing recommendations, and facilitating conflict resolution.
  • Request and maintain all necessary supporting documentation to clear actual or potential conflict issues.
  • Draft consents and conflicts waivers, ensuring compliance with legal and ethical standards.
  • Provide assistance in managing client guidelines related to conflicts of interest, confidentiality, and ethical obligations.
  • Prepare and implement ethical walls.
  • Assist in developing and implementing conflicts policies and procedures.
  • Provide guidance to New Business Intake Analysts with the goal of ensuring accuracy and consistency in preparation and analysis of conflict reports.
  • Assist in training of New Business Intake staff and Firm administrative staff.
  • Maintain knowledge of trends and developments involving legal and ethical rules related to conflicts of interest, confidentiality, and professional responsibility.
  • Coordinate with the conflicts team to update and maintain the conflicts database and ensure accurate conflict reporting.
  • Respond promptly to conflicts-related inquiries from attorneys and staff.
  • Maintain strict confidentiality and handle sensitive information with the utmost discretion.


Qualifications

  • Juris Doctor (J.D.) degree from an accredited law school.
  • Active membership in good standing with the bar association of the relevant jurisdiction.
  • Minimum of 2 years of hands-on conflicts experience in a law firm environment (e.g., Conflicts Attorney, Conflicts Analyst, Risk Management Attorney, or Professional Responsibility role).
  • In-depth knowledge of conflicts of interest rules, legal ethics, and professional responsibility.
  • Strong analytical and problem-solving skills with the ability to assess complex legal scenarios.
  • Excellent attention to detail and exceptional organizational skills.
  • Outstanding written and verbal communication skills.
  • Ability to handle multiple priorities and work under tight deadlines.
  • Proficiency in using Intapp Open and Intapp Walls software and other relevant legal technology tools.
  • Demonstrated ability to work independently as well as collaboratively in a team-oriented environment.
  • High level of professionalism, integrity, and ethical conduct.

 

Equal Opportunity / FCA statement
Qualified applicants with arrest or conviction records will be considered for employment in accordance with both the FCO and the California Fair Chance Act (FCA).


Equal Opportunity Employer

Fisher Phillips is committed to providing equal employment opportunities to all employees and applicants, regardless of race, ethnicity, religion, sex (including related medical conditions), gender, sexual orientation, national origin, citizenship status, veteran status, marital status, pregnancy, age, disability, or any other protected status, in compliance with all applicable laws.


Compensation

The salary range for this position is $120,000 – $160,000. Actual base pay within this range will be determined by several components, including but not limited to, location, relevant experience, internal equity, skills, qualifications, and other job-related factors permitted by law.


Why Join Us

At Fisher Phillips, exceptional talent is the foundation of our success. Joining our team means collaborating in a professional, dynamic environment leveraging cutting-edge technology. Our leadership fosters professional growth and provides opportunities to challenge yourself.

Our comprehensive benefits include health, dental, and vision insurance; a 401(k) with profit sharing; 18 days of vacation; 10 sick days accrued each calendar year; and 10 paid holidays per benefit year. Wellness programs and 24/7 telehealth services support your overall well-being. Visit www.fisherphillips.com to learn more.

 

To apply: visit the Fisher Phillips job application portal.

 

And if you’re interested in seeing your firm’s listings here, please feel free to reach out.





The internet is changing, and so is the way we search for information. Behind every search query sits a web crawler.

Yes, the software that scans the web, retrieves data, and helps search engines such as Google sort information into searchable indexes. Search engines would be nothing without crawlers. But did you know there are now different types of crawlers?

Traditional crawlers like Googlebot have long used rule-based systems to retrieve information, follow links, and match results to user queries. This method is still effective, but it comes with limitations.

Enter AI-powered crawlers, a new generation of bots built on artificial intelligence and machine learning. These crawlers don’t just scan sites; they understand them, using semantics, tone, and context to go well beyond traditional web crawling.

In this blog, we’ll discuss the differences between traditional and AI crawlers, how they will transform search in the future, and practical tips to help your content thrive in today’s digital world.

So, let’s get started!

What Are Traditional Crawlers?

Traditional crawlers such as Googlebot and Bingbot follow three principles: scan, copy, and index. They operate like librarians, cataloging information by HTML structure, metadata, and keywords.

    • Process: They follow links, parse code, and store page information in huge search databases.
    • Reliability: They work well with static websites and organized content.
    • Weakness: They struggle with changing websites, dynamic components such as JavaScript-heavy applications, and subtle context.

For example, a traditional crawler may miss the product information on a page that rewrites its class names or changes its structure, causing indexing errors. This has pushed the industry toward smarter, AI-assisted approaches.
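The scan-copy-index loop described above can be sketched in a few lines of Python. This is a minimal illustration, not any real crawler’s implementation: the sample page, the in-memory index, and the helper names are invented for the example.

```python
from html.parser import HTMLParser

class LinkAndTextParser(HTMLParser):
    """Collects href links and visible text — the two things a rule-based crawler cares about."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text_parts.append(data.strip())

def index_page(url, html, index):
    """Map each keyword -> set of URLs, mimicking a simple inverted index."""
    parser = LinkAndTextParser()
    parser.feed(html)
    for word in " ".join(parser.text_parts).lower().split():
        index.setdefault(word, set()).add(url)
    return parser.links  # links to crawl next

index = {}
html = '<html><body><h1>Blue widgets</h1><a href="/red">red widgets</a></body></html>'
next_links = index_page("https://example.com", html, index)
print(next_links)                 # ['/red']
print(sorted(index["widgets"]))   # ['https://example.com']
```

Notice the brittleness the article describes: the parser keys entirely on markup structure and literal words, so a page that restructures its HTML or renames things can silently fall out of the index.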

What Are AI Crawlers?


AI crawlers are less bots than interpreters. Using natural language processing (NLP), computer vision, and machine learning, they comprehend content in a way that approximates human understanding.

    • Context Awareness: AI crawlers don’t just read text; they infer meaning, tone, and purpose.
    • Flexibility: AI crawlers can identify and retrieve relevant information even when a site changes its structure.
    • Multimedia Intelligence: They can process video, audio, and images, making them far more capable than rule-based bots.

Imagine a crawler that doesn’t just read a blog post but knows whether it is a product review, a thought-leadership article, or a how-to guide. That is the promise of AI-assisted crawling.

The Rising Dominance of Googlebot

According to recent stats from Cloudflare, Googlebot still dominates even as AI crawlers rise. Its traffic grew 96 percent from May 2024 to May 2025, peaking in April 2025 at 145 percent of its May 2024 volume.

This spike accompanied Google’s introduction of AI Overviews, which added generative answers to search results. Google’s hybrid of traditional crawling and AI enhancements suggests the two systems will coexist for the foreseeable future.

How Does Traditional Search Work?

To appreciate the changes, it helps to revisit how search engines have traditionally operated:

    • Crawling/Indexing – Bots traverse websites and archive copies of pages on servers.
    • Ranking Algorithms – Pages are ranked by keyword relevance, backlinks, and content freshness.
    • Displayed Results – Results pages show ads, organic links, snippets, and panels.
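As a rough sketch of that pipeline, here is a toy inverted index with a naive term-frequency ranker in Python. The page contents and the scoring are invented for illustration; real engines also weigh backlinks, freshness, and many other signals.

```python
from collections import Counter

def build_index(pages):
    """pages: {url: text}. Returns an inverted index: word -> {url: term frequency}."""
    index = {}
    for url, text in pages.items():
        for word, count in Counter(text.lower().split()).items():
            index.setdefault(word, {})[url] = count
    return index

def rank(query, index):
    """Score each page by the summed term frequency of the query words."""
    scores = Counter()
    for word in query.lower().split():
        for url, freq in index.get(word, {}).items():
            scores[url] += freq
    return [url for url, _ in scores.most_common()]

pages = {
    "a.com": "fresh content ranks well when content is relevant",
    "b.com": "backlinks and keywords matter",
}
index = build_index(pages)
print(rank("content keywords", index))  # ['a.com', 'b.com'] — a.com mentions 'content' twice
```

This is exactly the keyword-matching behavior the next section contrasts with AI crawlers: the ranker counts literal words and has no notion of intent or meaning.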

AI-Driven Search: A New Era

AI-based search engines go beyond keywords. They can:

    • Understand natural language – answer complex, conversational questions.
    • Provide direct answers – eliminate the need to browse through multiple results.
    • Personalize results – tailor suggestions to user behavior.
    • Handle multimedia – analyze videos and podcasts, and recognize voice queries.

Tools like ChatGPT, Google Gemini, and Microsoft Copilot are built on large language models (LLMs) and turn search into a conversation rather than a list of results.

AI Crawlers vs Traditional Crawlers: Key Differences

1. Understanding User Intent

Traditional Crawlers: Match queries by keyword, scratching the surface without necessarily understanding what the query means.

AI Crawlers: Go beyond the keyword, interpreting user intent, semantics, and context to deliver more useful results.

2. Scalability and Efficiency

Traditional Crawlers: Can amass large volumes of data, but their limited context awareness produces duplicates and irrelevant records.

AI Crawlers: Filter and prioritize content intelligently, producing leaner, more efficient, and more relevant indexes.

3. Real-Time Adaptation

Traditional Crawlers: Struggle to keep up with new website structures and technologies, requiring manual updates.

AI Crawlers: Learn and adapt in real time, recognizing patterns and evolving without human intervention.

4. Content Depth and Quality

Traditional Crawlers: Typically access visible text and links, often ignoring multimedia, user-generated, and interactive content.

AI Crawlers: Evaluate multimedia, dynamic content, and even sentiment to build a more refined view of overall page quality.

Quick Wins for Crawlability

Technical SEO remains essential even as AI advances. Here are quick fixes to improve crawlability:

    • Serve important pages with server-side rendering (SSR).
    • Keep HTML lean, semantic, and clean.
    • Improve page speed – slow sites lose out.
    • Use clear, descriptive headings and titles (H1–H3).
    • Avoid blocking AI crawlers in robots.txt or llms.txt.
    • Publish verifiable, factual, well-formatted, and timely information.
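As an example of the robots.txt advice above, a permissive file might look like the sketch below. The user-agent tokens shown (Googlebot, GPTBot) are real crawler names, but check each vendor’s documentation for the exact tokens and policies that apply to you.

```
# Let search and AI crawlers index public content
User-agent: Googlebot
Allow: /

User-agent: GPTBot
Allow: /

# Keep private sections out of every index
User-agent: *
Disallow: /admin/
```

The key point is that an overly broad `Disallow: /` aimed at AI bots also removes your content from the AI-driven answers and overviews discussed earlier.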

Conclusion: Preparing for the Future of Indexing

The future of search lies at the intersection of traditional and AI crawling. While rule-based crawlers remain essential, AI-powered crawlers bring a new level of intelligence, adaptability, and context awareness.

For brands, this means rethinking SEO strategies and embracing AI Optimization (AIO) alongside Generative Engine Optimization (GEO). By preparing content for AI-driven indexing today, businesses can ensure long-term visibility, authority, and discoverability in tomorrow’s search ecosystem.

Stay tuned with us for the latest blog topics!
